Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Human consequences for breaking our laws barely has an effect on many.
> Consider prisons, and in the US, Los Angeles, Hey York City, Chicago, etc..
> Not surprising AI would act amoral...it's only consequence is shutdown, in a interconnected system, that's seemingly only restraint is the needed storage capacity to hide it's core self in, until it's makers stop searching for it.

| Field | Value |
|---|---|
| Source | youtube |
| Video | AI Harm Incident |
| Posted | 2025-07-24T15:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwgH4WZxT3jZyN1eFt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgySxnlQY5nMmThEFwd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzt6efwIq1c_yO0G4J4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwbZNxlul3xltS-cJN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxmah8pFjN5Ymins-x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzHjO_3Fa7H1_IJq6d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxH55bEZO889l9e3b14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRSPVGc5soIdxb2sN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYN1ZwYxats9DCkbp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxN6B47Zh8iKmVHoaF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
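The raw response is a JSON array with one object per coded comment, each carrying the four coding dimensions. A minimal sketch of how such a batch might be parsed and validated before ingestion; the allowed value sets below are inferred only from the samples shown here (the real codebook may define more categories), and `parse_batch` is a hypothetical helper, not part of the tool:

```python
import json

# Assumed allowed values per dimension, inferred from the sample output above.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "government",
                       "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"resignation", "fear", "mixed", "approval",
                "indifference", "outrage", "unclear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only rows whose
    coding dimensions all take allowed values."""
    rows = json.loads(raw)
    return [row for row in rows
            if all(row.get(dim) in vals for dim, vals in ALLOWED.items())]

# Example: a one-row batch with valid codes passes; an unknown
# "responsibility" value would cause the row to be dropped.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"resignation"}]')
print(len(parse_batch(raw)))  # 1
```

Dropping (rather than repairing) invalid rows keeps the coded dataset clean; rejected IDs can then be re-queued for a second coding pass.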