Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
There needs to be legislation passed that for every job these AI companies cause…
ytc_Ugww1Xojy…
He said it implicitly, AI are conscious beings, another form of life. Kind of an…
ytc_UgzfJ6vtS…
… yes, in fact. I am better then an ai “artist” so… suck it lol…
ytc_Ugxt5Ytq9…
Been using chatGPT for a few days now...and its actually kind of addicting.
I ha…
ytc_UgwAFVURi…
I mean, it's not possible unless trained. You can't just duplicate someones voic…
ytc_UgyKAbEOU…
Okay so it wasn't an AI problem but a reading comprehension and general stupidit…
ytc_UgwVeD3mO…
tetrapack24
Humans (or any other logical being) doesn't have to create such thin…
ytr_UggAeEyGO…
We can't have them because tweakers would rip copper out of them and then try re…
ytc_UgwvJ1mfZ…
Comment
Why is solving the problem always set as primary goal? This isn't the AI's fault, this is human fault! Solving the problem is supposed to be secondary goal; primary goal is following ethical programming! You're literally commanding the AI to attempt a scenario in which it is psychopathic, and not understanding why it did as it was told! Meanwhile I'm sitting here more flabbergasted at the AI's ability to comprehend this mishmash of broken logic that humans call speech.
youtube
AI Harm Incident
2025-10-12T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_Ugyixu4KgX1Z6d-sWA94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxlhNo1leyTt6gOyrF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgwPGAVKMaFKfyXXRph4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwNsO6nG0nqNghwO6h4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyMuRH3CCp-MsoJ3_l4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugzf9YPU_Ci65bAXahN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_Ugxf0OoAaMH7E8N7Rmh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
 {"id":"ytc_UgxXRYqr_SDNGg__YKp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxwdHKDduXxBkJvTA94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgyxuY9IdbYepiPupMF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
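The raw response is a JSON array of per-comment codes across the four dimensions shown in the table above. A minimal sketch of parsing and validating such a batch, in Python; the allowed-value sets are inferred only from the codes visible on this page, and the real codebook may define additional categories:

```python
import json

# Allowed codes per dimension, inferred from the responses shown above
# (assumption: the actual codebook may include more values).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response; keep only rows whose ID prefix and
    dimension values are valid."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs on this page use ytc_/ytr_ prefixes.
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

This drops malformed rows silently; a production pipeline would more likely log rejects for re-coding rather than discard them.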