Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> Should be automatic £1 billion dollar compensation for the nearest and dearest of anyone killed by AI, plus automatic life sentence for any driver that allows their car AI to kill someone, plus the road laws should change so that is is automatically the fault of the following vehicle's driver if they hit anything whilst travelling forward. Tesla should also pump millions into free motorcyclist education schemes.

Platform: youtube · Incident type: AI Harm Incident · Posted: 2022-09-04T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyoccfQ828_KCVGAmJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyqumgtzbF2qq-CjNR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwokRuJt991aa0Hrs54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzfuPPCBHNX8x42XzJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwDD6K8Jfjo6cKnK2N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwv7AAmmhE2B_cwOpp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyn4CMXhWxCTgtDxux4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxNr3v69V9BOUM4S9h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzncvP7W909pdLQbNZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw4WwymQMDmlQCTmkR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
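A raw response like the one above is a JSON array of per-comment records, each carrying the four coding dimensions from the table (responsibility, reasoning, policy, emotion). The sketch below shows one way to parse and sanity-check such a response before storing it; the allowed value sets are inferred only from the labels visible in this sample, not from the project's actual codebook, so treat them as placeholders.

```python
import json

# Allowed values as seen in the sample output above; the real codebook
# may define additional categories (assumption, not a definitive schema).
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"approval", "outrage", "indifference", "mixed", "resignation", "fear"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept if it is a dict with an "id" and every coding
    dimension holds a recognized value; malformed records are dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
print(len(validate_response(raw)))  # 1
```

Dropping invalid records silently is just one policy; a production coder might instead re-prompt the model for the rejected comment IDs.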