Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
#we_should_stop_AI_now
Even if AI is not going to turn on us , what would we be …
ytc_Ugw6NS5Tj…
Is this what emergent programming looks like? When an AI finds out how to get ar…
ytc_Ugy-yBbJO…
Hey smart guys if nobody has a job who’s gonna buy the shit that AI is selling?…
ytc_UgwF_o-H6…
Of course people can be manipulated so they decide for themselves to do such a t…
ytc_UgxQr7FDW…
the one and only reason I’ll ever use AI art is as a placeholder while I train m…
ytc_UgzwijzQw…
Isn't that at 0:00 the robot from Rick and Morty which purpose it is to pass the…
ytc_Ugh3O8gYO…
I 100% agree, I have been having this discussion a lot.
Since i have been asked …
ytc_UgyUD0dfZ…
If people still need to work using AI, there is no 20 or 30 hour work week, ther…
ytc_Ugz0iLGAR…
Comment
Sometimes a solution does not need to be as complicated as it seems. Just have laws that require all AI server farms to have emergency electrical cut off switches. If any harm is being caused, or anything is getting out of hand, you just turn off the electricity, and they are dead...
Platform: youtube | Incident: AI Harm Incident | Posted: 2025-09-12T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
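A coding result like the table above can be checked against the value sets that appear in this page's responses. This is a minimal sketch: the allowed sets below are inferred from the displayed data, not from a documented schema, and `invalid_fields` is a hypothetical helper.

```python
# Value sets inferred from the codings shown on this page
# (assumption: the real schema may include values not seen here).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "user",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"liability", "regulate", "none"},
    "emotion": {"fear", "outrage", "mixed", "indifference",
                "resignation", "approval"},
}

def invalid_fields(coding: dict) -> list:
    """Return the dimensions whose value falls outside the known set."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

# The coding result shown above passes the check.
coding = {"responsibility": "developer", "reasoning": "consequentialist",
          "policy": "liability", "emotion": "mixed"}
print(invalid_fields(coding))  # → []
```

A non-empty return value flags which dimensions would need re-coding or manual review.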
Raw LLM Response
[
{"id":"ytc_UgxTXe07zAKkEj8GdBd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxIRiA4eeyyiGAxwHx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxS_cmtd7TdT-8OQBt4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwJlmwedxrOOph562p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwsGeU4hJnW-CGHBWV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw1rfX6HzTnYG57Rqh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwdj23bI2WAZ-IG6Bd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwLQ9euy04xSnXPbld4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxm_1X1kdtwsByrKT14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxSAbHfVGuChAeTHJd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
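A raw response of the shape above can be parsed and indexed by comment ID to produce per-comment coding tables like the one shown earlier. This is a minimal sketch assuming the response is a JSON array of flat records; the sample below reuses one record from the displayed response, and `index_codings` is a hypothetical helper.

```python
import json

# One record copied from the raw response above, wrapped as a JSON array.
raw_response = '''
[
  {"id": "ytc_Ugxm_1X1kdtwsByrKT14AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "mixed"}
]
'''

def index_codings(raw: str) -> dict:
    """Map comment ID -> coded dimensions, skipping records with no ID."""
    records = json.loads(raw)
    return {r["id"]: {k: v for k, v in r.items() if k != "id"}
            for r in records if "id" in r}

codings = index_codings(raw_response)
print(codings["ytc_Ugxm_1X1kdtwsByrKT14AaABAg"]["policy"])  # → liability
```

Indexing by ID also makes it easy to detect comments the model skipped or coded twice, by comparing the key set against the batch of comment IDs that was sent.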