# Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a coding directly by comment ID.
Random samples:

- No one cares about this AI nonsense, it’s another fad like crypto. My job is saf… (ytc_Ugzh6n3yD…)
- Well, hey, my house is already on fire so why *shouldn't* I light a bonfire in t… (rdc_d2zq0zz)
- We actually have to worried about human habit from now on!. Not the AI. Coz ai i… (ytc_UgxC7E94b…)
- Idk I think those corporate class jobs are actually future proofed against AI fo… (rdc_n00tpvl)
- "did you only include the bromide one because you suggested it to someone before… (ytc_Ugz_fZVUy…)
- It is problem with people, not with technology. There is good ai generated stuff… (ytc_UgxnBfj8w…)
- We are going to learn more and more about how consciousness is the essence of th… (ytc_UgyeaZ3Nc…)
- Isn't Ai art just going to inevitably self destruct? As more and more Ai images … (ytc_Ugx5I3VXi…)
## Comment

> The only segment I think is interesting is why people don't also watch car crash fatalities on record without Tesla autopilot (not even just Teslas). We'll readily see glaring examples of 'if a human was driving X wouldn't have happened'. People are idiots on the road and many fatal accidents, autopilot or not, are from human error. If we looked at that footage it would be just as revolting.

Source: youtube · AI Harm Incident · 2024-12-18T17:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgywcUDsmSHtKmu2EY94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxR7bWE1QNGoGAe4Bx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKHNHGvCF8TCBofQl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwe8me_sXhvf6h1VpN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgydPcgYpA9Eo0GpnF54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyjk8hF3wmW4-dVXc54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxccQADSSlRhUGeMLB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwnZzC2pi7V_nX2JLd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgymIYmuz-HlGnRmCmR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwgBGDpZa6uRQhpi-J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
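A raw response like the one above is a JSON array with one object per coded comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). The snippet below is a minimal sketch of the lookup-by-ID step; the function name `lookup_coding` is illustrative, not part of the tool, and the embedded sample uses two records copied from the response above.

```python
import json

# A raw LLM response: a JSON array of per-comment codings, each with
# the four dimensions responsibility, reasoning, policy, and emotion.
raw_response = """[
  {"id": "ytc_UgywcUDsmSHtKmu2EY94AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwnZzC2pi7V_nX2JLd4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw LLM response and return the coding for one comment ID,
    or None if that ID is absent from the response."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgywcUDsmSHtKmu2EY94AaABAg")
```

After the call, `coding["emotion"]` holds `"resignation"`; an unknown ID yields `None`, which is a convenient signal that the model dropped a comment from its output.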