Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "That is not a problem bro thats the best thing we can imagine keep it up ai's!…" (ytc_UgyX_M1Ug…)
- "AI art will never be better than the art of an actual artist. Still, there are …" (ytc_Ugxt5SRJY…)
- "spoken like a true hypercapitalist. AGI = human replaceable Goes back to Turing…" (ytc_UgxnWFidC…)
- "Only half way through but wouldn't giving a robot the ability to feel pain be im…" (ytc_UggiiBkWN…)
- "Its not AI (yet) its labours moronic socialist policies. Driving up minimum wage…" (ytc_UgxiI673b…)
- "Program AI on Temple OS. Integrate the Bible directly into their code. It's the …" (ytc_UgyDkSpI6…)
- "You’re telling me that my job that’s nothing but sending emails and being presen…" (ytc_UgyamTHV3…)
- "I saw someone saying that being against AI art was ableist...bro...my bromosexua…" (ytc_Ugx0K4ydJ…)
Comment
It's my impression that people *believe* they can take a nap while these are on 'autopilot'. There was a major incident here in Phoenix, where Waymo was testing and hit and killed a person on a bicycle (the bicyclist was not in the cross walk, and it was a bit dark out) the person who was supposed to be monitoring the vehicle behavior was negligent and not performing her job as the 'human' to prevent it. If she were alert, i.e. doing her job, the outcome probably would have been injury, not death. Could that be the case with these two tragic stories?
Source: youtube
Incident: AI Harm Incident
Posted: 2022-09-03T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw7-HYtNcHnzRvs3R94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzo2oNBN_ODlk4no9x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwQEKxn-LOhiRGAUSV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwAfeNphLL4V8Nluql4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwc0uu-jsgVHswe7Wx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwbsSTAmGuk6L50rpZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugy4SgqPfyE_aaq6KHh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzYA_EBC1DYHTFejOF4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugycy61gUrhGP2XIY3l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzJlwQK-98RxCDFN9d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
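A raw response in this shape (a JSON array of records, each with `id`, `responsibility`, `reasoning`, `policy`, and `emotion`) can be indexed for ID lookup and tallied per dimension. This is a minimal sketch, not the pipeline's actual code; the field names come from the JSON above, while the inline `raw` string (two records copied from it) and the tallying logic are illustrative assumptions.

```python
import json
from collections import Counter

# Two records copied verbatim from the raw LLM response above;
# in practice this string would come from the model output.
raw = '''[
  {"id": "ytc_Ugw7-HYtNcHnzRvs3R94AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwAfeNphLL4V8Nluql4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

records = json.loads(raw)

# Index by comment ID, mirroring the "Look up by comment ID" feature.
by_id = {rec["id"]: rec for rec in records}

# Tally the value distribution of each coding dimension.
dimensions = ("responsibility", "reasoning", "policy", "emotion")
tallies = {dim: Counter(rec[dim] for rec in records) for dim in dimensions}

print(by_id["ytc_UgwAfeNphLL4V8Nluql4AaABAg"]["policy"])
print(tallies["responsibility"])
```

With the full ten-record response, the same tallies would summarize how often each responsibility attribution, policy stance, or emotion was coded across the sample.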