Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Mega Man Battle Network really has predicted the future. From introducing mobile…" (ytc_Ugx5aLtmO…)
- "I see your point, but would you want the Trump or any administration to have thi…" (rdc_mwupxpx)
- "16:05 my GF showed her sister some photos I took of some ducklings, and she thou…" (ytc_Ugz9KwhKh…)
- "You are so right on. Theirs no guarantee in life and no free lunch. The big prob…" (ytc_UgyaXW87G…)
- "Vincent Van Gogh was a nobody when he was alive, he wasn't rich, nobody talked a…" (ytc_UgzDzkhB8…)
- "Asking AI if we can change a function or do a test is insane to me. We should no…" (ytc_UgxfJoUI4…)
- "THE WAY WHAT HAPPENED AFTER ROBOT ANGRY💀💀💀💀ALSO WHEN HE YEETED BOX AT HIM💀💀💀💀BRO…" (ytc_UgyyVAnt3…)
- "I do not need a car mucho less a robot tò make me more fat and lazy, cheers fr…" (ytc_UgyvKWyi4…)
Comment

> I feel as though in this scenario, the self driving car should chose the option that statistically will cause the least amount of damage. And sure, for this highly improbable it may be safer for the motorcyclist to not wear a helmet, in the grand scheme of things the person who wears a helmet is still more likely to have less serious injuries than someone who doesn't so that point doesn't really matter.

youtube · AI Harm Incident · 2022-06-05T23:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz3NDLJm5vOL8_5Ki14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzzec3Twn63agGPyDB4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz8wJCpFoQ2L1TPwT94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxDg--Hfm2lG0jR6Ut4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxWoJcDFo_ekiyvEmt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx8hBTPSf8XBnRxR9t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw4jM93_9cAtGe9wgN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwCoTNgNzS8ucWLuet4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyKUDGVaTLJ7c09rdd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxIAyCois5Y25HZHYZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
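A raw batch response like the one above has to be parsed and checked before its codes are trusted, since the model can emit categories outside the codebook. Below is a minimal validation sketch; the allowed value sets are an assumption inferred only from the values visible on this page (the real codebook may define more), and `validate_batch` is a hypothetical helper, not part of any existing pipeline:

```python
import json

# Allowed values per dimension, inferred from the examples on this page.
# ASSUMPTION: the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "developer", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "industry_self", "ban", "liability", "regulate"},
    "emotion": {"indifference", "approval", "outrage", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose codes are known."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Usage with a small two-record batch (second record has an unknown code):
raw = (
    '[{"id":"a","responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"b","responsibility":"robot","reasoning":"unclear",'
    '"policy":"none","emotion":"mixed"}]'
)
print(len(validate_batch(raw)))  # 1: the record coded "robot" is rejected
```

Filtering (rather than raising) keeps one malformed record from discarding the whole batch; rejected IDs could instead be queued for re-coding.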