Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgwIbq8Xz…`: "Uh, he either meant cortana, which I don't know and don't care to google, or if …"
- `ytc_Ugw66m3eJ…`: "Sam, I knew there was something wrong with AI art as a beginner artist myself… I…"
- `ytc_UgydoipYd…`: "1:30 Yes, but they are necessary, because even AGI can make mistakes. In the rea…"
- `ytc_UgxhbmGV2…`: "Hi yuval, Many thanks for sharing eye opening ideas.how do you interpret the in…"
- `ytc_Ugzt0GG47…`: "Watching july 2025. ChatGPT has turned sentient ... sorta. Show it a photo you j…"
- `ytc_UgwZ-1XJH…`: "This old man is full of SH*T. If AI takes 10 jobs, what are those unemployed peo…"
- `ytc_Ugy5vDQWG…`: "Small wireless earbuds, and a talking life like AI operating system on our phon…"
- `ytc_UgwZLxCkb…`: "AI can have my job, just send me a check for at least $5000/month and it can hav…"
Comment
"...machines will never attain consciousness" they don't need to. All they need is a task to execute, and the AI will do it relentlessly. The false safety of "rules" is a big misconception. Nothing said here stands for neural networks, because those do not follow the conventional sequential algorithmic thinking anymore. Although technically modelled on seqential computers, neural networks don't use rules, and algorithms. Higher level, these work on statistical basis.
youtube · AI Moral Status · 2025-06-09T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwTFAKAdn2EG1cjmR54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz1teoAc6-Dv5VjH654AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz8BOE_uDWpYcuApq54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCEJUaxefZrE9lteB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxI7WCDtbOtDKikPQN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugweg_Y1KyLvDkqK4bB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxfBv1Ss-BT9Hvt3WV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy-bVp6JTKZyOFVgfR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgwSq0nwdf3A4ckEqe14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgZfRq7U-Q3sf4IU94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
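A raw response like the one above has to be parsed and validated before the per-dimension values can be attached to comments. The sketch below shows one way to do that in Python. It is a minimal illustration, not the tool's actual pipeline: the `ALLOWED` vocabularies are inferred only from the values visible in this sample output (the real coding schema likely has additional categories), and `parse_coding` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# This is an assumption: the real schema probably defines more categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "distributed"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "approval", "mixed", "fear", "outrage"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of records) into a dict
    keyed by comment ID, dropping records with a missing ID or any
    out-of-vocabulary dimension value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue  # skip records the model emitted without an ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with a small synthetic response (IDs are placeholders):
raw = (
    '[{"id":"ytc_example1","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)
coded = parse_coding(raw)
```

Validating against a closed vocabulary like this catches the common failure mode where the model invents a label outside the codebook; dropped records can then be re-queued for a retry rather than silently polluting the coded dataset.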