Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Never exposed to this idea before seeing this in my feed, my first thought (have… (ytc_UgxjrwERo…)
- Personally, 1 think that its base on our (human) comments about AI that make AI … (ytc_UgyrGFuZa…)
- Important distinction, we dont have true AI. We have VI. These are advanced chat… (ytc_UgxKJQ2rL…)
- I saw someone quote Irobot "Can an AI turn a canvas into a beautiful work of art… (ytc_UgxIw69Vc…)
- ''a.i. won't be able to fully replace software engineers anytime soon'', meanwhi… (ytc_UgybvUDkz…)
- I'm pretty sure AI could watch all of the Breaking Points and Rising episodes Sa… (ytc_UgykQyTCR…)
- Hi Roderick, you got the right answer. Kudos. The contest is over and winners ha… (ytr_Ugzot1eVH…)
- AI is only as [ insert anything ] as its makers and subjects it learns from.… (ytc_UgyUjEA5u…)
Comment
I still question the usefulness of self driving cars. You forgot to mention potential glitches in the software that can be even more dangerous to the cars decision making. I get ticked when people think a better future a freedom is through advancing tech to were they want more and more government involvement.
youtube · AI Harm Incident · 2018-10-18T12:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxNu6orq72VYmfHfwB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZh_afhC_OOFGLQXR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyCTPsECAP96PGh3Hl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"resignation"},
{"id":"ytc_UgxJKKQ9sqMp_Ti81H54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxqNM8gW2hHoyqHe5l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxnIUdclFIQ-4Rv2z54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzlBBbufEX3_0ASe354AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwCGqFtlXMJ6s8DwaZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwtXdfQuJN50FtVCfZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyosQqVFTeUat0GwLx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
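The lookup-by-ID view above can be reproduced offline. Below is a minimal sketch, assuming the raw response is a JSON array shaped like the one shown; the allowed value sets are inferred from the codes visible in this dump, not from the actual codebook, and may be incomplete:

```python
import json

# Raw LLM response in the format shown above (truncated to two entries for brevity).
raw_response = '''
[
 {"id":"ytc_UgxNu6orq72VYmfHfwB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgzZh_afhC_OOFGLQXR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
'''

# Dimensions and allowed values, inferred from codes observed in this dump;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "approval", "resignation", "indifference", "mixed", "unclear"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw batch response and index codes by comment ID,
    skipping any record with a missing ID or an unknown value."""
    by_id = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[cid] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

codes = index_codes(raw_response)
print(codes["ytc_UgzZh_afhC_OOFGLQXR4AaABAg"]["responsibility"])  # prints "company"
```

The validation step guards against the model emitting an off-codebook label; dropped records can instead be collected for manual re-coding if that suits the workflow better.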