Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Thank you all for the support. These past four years have been really crazy. Thi…" (rdc_gbihezh)
- "Even if you try to argue that ai "art" is art...ai "artists" aren't artists, the…" (ytc_Ugwe4NyBW…)
- "Haven't checked today, but yesterday he refused to congratulate Biden and said h…" (rdc_gbml624)
- "Being a software engineer what i noticed most is the new developers just using c…" (ytc_UgyiBXqu9…)
- "I'm one of the human's who absolutely doesn't trust Ai and I know it's not 100% …" (ytc_UgwHzUUS7…)
- ""You wrote that AI could be the greatest risk to the continued existence of huma…" (ytc_Ugz0HYwMg…)
- "AI art is better than abstract art tho. For abstract art people pay millions, bu…" (ytc_UgxeK4lNc…)
- "This self driving bullshit should be banned in all North America , if it doesn’t…" (ytc_UgzF_C6bC…)
Comment
32:23 it is the same problem as trying to do the same with the human brain. Which neuron connected to what other neuron does what? Well, we are just barely trying to understand that in terms of entire regions of the brain. Much less individual connections. And AI do not even properly have those regions, though some multi network models might be comparable to that.
Why do people commit murder? Where does "murder" happen in the brain? Can you turn it off without breaking what makes them a human being? We don't have any idea. And AI is worse than that. To understand AI well enough to build it safely would mean understanding far beyond what makes us ourselves.
Source: youtube · Video: AI Moral Status · Posted: 2026-01-08T16:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxXvP06xB_rvHXU8nl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxB2lUMC10V2WCKMdh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzG6m5nNk-ZQp4yPdd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzdLgUpm0zqRww_36x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxXKB0Q9EOyb0TYAQ54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz9jWegCqJ5MLH9GXF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxjHKweqa7s6ZC0JHB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy-KZ4-7G2BKQOny894AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugy88yz9_C5B-z5vALJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx-G5YAEcxVcUZLiZt4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
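A raw response like the one above (a JSON array of per-comment codings) can be parsed and indexed by comment ID to support the lookup shown in this view. This is a minimal sketch, not the tool's actual implementation; the two example rows and the variable names are illustrative, with field names taken from the coding-result table (responsibility, reasoning, policy, emotion).

```python
import json

# Raw LLM coding response: a JSON array with one object per comment.
# These two rows are copied from the example response above.
raw_response = """
[
  {"id": "ytc_UgxXvP06xB_rvHXU8nl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxB2lUMC10V2WCKMdh4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

codings = json.loads(raw_response)

# Index the codings by comment ID so a single comment can be looked up
# directly, as in the "Look up by comment ID" field.
by_id = {row["id"]: row for row in codings}

row = by_id["ytc_UgxXvP06xB_rvHXU8nl4AaABAg"]
print(row["responsibility"], row["emotion"])  # -> ai_itself fear
```

If the model occasionally returns malformed JSON, wrapping `json.loads` in a `try/except json.JSONDecodeError` and logging the offending response is a common safeguard before coding results are stored.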