Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
You should watch "Message to Planet Eden 2025" a Christian message about AI and …
ytc_UgykG6dnr…
Honestly i also see nothing wrong with using ai generating for content like game…
ytr_UgxAj8Jzw…
There is something called style. If your work is something that can be made with…
ytc_UgzztBLtF…
Maybe AI is not a good thing if it is a cannibal of others efforts and work? May…
ytc_Ugy6TVIFF…
AI, Robots & 5G are all weapons against humanity. Please don't allow them 2 rol…
ytc_UgziuaL_A…
When confronted with the dilemma, how many AI will chose the imaginary category …
ytc_UgyURZ6p8…
Ai ive become death destroyer of worlds love of money got us here cant eat money…
ytc_UgwrKcsgf…
"Artists should just quit, theyre useless." Hey bud, where did your algorithm ge…
ytc_UgxESPB3s…
Comment
Kinda felt like these robots only said commands that were triggered by keywords said by the stoned out dr. Would they once evolve to comprehend visuals to make their own decisions? If so how you hardwire the android to take humour or a joke. The more advanced you make AI, they will never understand humans because there robots, with no emotion. That being said the only good idea to me that brings robots into play are jobs that are far to dangerous for humans to complete. They don’t need to speak they don’t need to free roam. Just literally do the task they were created for only.
youtube
AI Moral Status
2019-11-11T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxXg339nGK4Ig2Xu_J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxowXB8auIeAE2-EnN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzzAb1sItOr5kvOw1l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyoZXWT0yxjbJDrtxN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzs4Nr4q4nAtXn-R1h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyDk5HwrVjeMuOwKXV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx5glvnB8nukdO9ERh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyQHJQPZrvuxzNDbc14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx13mxDUPagO7vWqOh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzRFph_isQl6unSkm94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
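The lookup-by-comment-ID workflow above can be sketched as a short script: parse the raw model response as JSON and pick out the record whose `id` matches. This is a minimal sketch, assuming the raw response is valid JSON with the field names shown above; the `lookup` helper is hypothetical, not part of the tool.

```python
import json

# Raw model output as shown above (two records kept for brevity;
# field names match the source JSON).
raw_response = """[
{"id":"ytc_UgxXg339nGK4Ig2Xu_J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRFph_isQl6unSkm94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]"""


def lookup(comment_id, raw):
    """Return the coded dimensions for one comment ID, or None if absent."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)


coded = lookup("ytc_UgxXg339nGK4Ig2Xu_J4AaABAg", raw_response)
print(coded["responsibility"], coded["emotion"])  # developer indifference
```

Each object in the array corresponds to one row of the Coding Result table; the dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) are read directly from the matched record.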