Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples — click to inspect
... Of course one possible combination is:
Fire P1 and A1 and keep S1 + GPT4, w…
rdc_jgi08nq
If an AI model is wrong 00000.01% of time and teaches self. Won’t it eventually …
ytc_Ugz5z_Yg7…
realistically it will be made such that training data for ai will legally requir…
ytr_Ugw5lDjDe…
I am really unsettled by the fact that openAI has to suppress chatgpt so much. I…
ytc_UgzQF57Tw…
it is a real concern he has. not about the AI being sentient, but about how they…
ytc_UgzbVae7e…
This has to be the scariest comment section I have ever seen in a YouTube video.…
ytc_UgwtPYlEV…
I had a long 'conversation' with an AI in which we, between us, came up with the…
ytc_UgzjXyMMZ…
I’ve been out of college for (“only”) nearly 5 years and I feel like such a boom…
ytc_Ugy8_G6d5…
Comment
I work professionally as a programmer, that is my job. And I want my car to be so little automated as possible. I don't want it to break when I am about to hit something even, for that day when I actually don't want it to. I hate that today we so often have to fight our devices that tries to prohibit what the user wants. I would say Mac is the worst offender but Windows is getting there as well.
I fear the day where driving will be illegal and only self driving ai cars will be allowed, it will probably come to that though..
Source: youtube · Topic: AI Harm Incident · Posted: 2022-11-20T20:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxWEUMaUEDYyGIjoQ94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxdK8TU5Di3ES4ElLZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxAmzlCFmrZmxMHRI94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxQ3BRizlPG8DSnvyx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzefO_F95ohs0NlSdp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"disapproval"},
{"id":"ytc_UgzG4obmJAZGFm4HiGR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwFHp8SUlQu3Uj7rex4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwB5QsAllfNxW11pJN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxM8a7Nd1Qit5aSspR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxhzhse5PJvVG9QIF54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]