Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I understand what you mean, but the burn out process of writing is part of why I…" (`ytr_UgxzDt5bM…`)
- "With all honesty, I hate this man speech... sounds like someone who wrote the bi…" (`ytc_Ugxhpjs2Q…`)
- "I was agreeing with everything he was saying until the end where he starts takin…" (`ytc_Ugy0KV9yz…`)
- "@CALndStuff dude there's been incidents where ai has killed there operators a…" (`ytr_Ugz2pUaPk…`)
- "Commenting to increase engagement and help the algorithm spread this. This is in…" (`ytc_UgyEIZpkI…`)
- "AI: misidentifies a bag of chips as gun / Entire squad of cops whose times were al…" (`ytc_UgyNIwXFk…`)
- "Exactly, then these TOXIC older men online telling young boys to stay away from …" (`ytr_UgwHeWatK…`)
- "I work in tech support. I've told friends, family, and customers that they BETTE…" (`ytc_Ugyps4bNx…`)
Comment (youtube, 2018-03-21T19:5…)

> I thought that was a very good comment on what happened. Even though I don't like the whole idea of autonomous driving cars, I can't help but think that the driver was more at fault than the car. He was supposed to pay attention and be ready for any danger. If you argue that he was, then that means he would have hit the woman, even if he was driving a regular car. Before I feel autonomous cars are safe, there is going to have to be other safety devices added to the driving environment. For instance, cell phones could have some kind of a signal that the cars could pick up, since most people carry a cell phone. Also, some of Nikki's suggestions were good, like infrared detection.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
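A coding result like the table above can be carried around as a small typed record — a minimal sketch, assuming field names derived from the displayed dimension labels (the names are illustrative, not a documented schema) and that `Coded at` is an ISO-8601 timestamp as shown:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Coding:
    """One coded comment, mirroring the result table above.

    Field names are assumptions based on the displayed dimensions.
    """
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

# Values taken from the coding result shown above.
c = Coding(
    responsibility="user",
    reasoning="deontological",
    policy="none",
    emotion="indifference",
    coded_at=datetime.fromisoformat("2026-04-27T06:24:53.388235"),
)
print(c.coded_at.year)  # -> 2026
```

`datetime.fromisoformat` accepts the microsecond-precision timestamp format shown in the table directly, so no format string is needed.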
Raw LLM Response
```json
[
{"id":"ytc_Ugwt3BiPlp9ro1_hrdV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyZciDCmXZTzdJXt9R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgywelGNAIfKRx4HQzh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyu-4QIfI2bE30ktBh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzdMJJiR5Z07vC0dpx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0WubZVZkPIVv2T8h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz30mNkDrc7bou2Uo94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz1XmMUdyiR38KXsat4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzqq1iviYPq9eVz1wJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzvXVSUeIuyd2bDDwN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"resignation"}
]
```
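A raw batch response in this shape can be parsed, validated, and indexed by comment ID to support lookup — a minimal sketch, assuming the JSON array format shown above and dimension vocabularies inferred from the values observed in this sample (the vocabularies are an assumption, not an official codebook):

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumed vocabularies; extend as the real codebook dictates).
VOCAB = {
    "responsibility": {"user", "company", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"approval", "indifference", "fear", "resignation", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index in-vocabulary records by comment ID."""
    by_id = {}
    for rec in json.loads(raw):
        # Skip records with missing or out-of-vocabulary dimension values.
        if all(rec.get(dim) in allowed for dim, allowed in VOCAB.items()):
            by_id[rec["id"]] = rec
    return by_id

# Hypothetical one-record response in the same shape as the output above.
raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"deontological",'
       '"policy":"none","emotion":"indifference"}]')
codings = index_codings(raw)
print(codings["ytc_x"]["responsibility"])  # -> user
```

Indexing by ID is what makes the "Look up by comment ID" inspection flow cheap: each coded comment resolves in a single dictionary access rather than a scan of the batch.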