Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If the cars are self-driving, then each one should only follow the vehicle in front at a distance proportional to its speed, so that it always has time to come to a full stop before reaching that vehicle. And yes, brakes can fail, but that is a mechanical error, not a programming error; if a real person were driving the car, the possibility of brake failure would be just as great as with an AI driving.
Platform: youtube
Category: AI Harm Incident
Posted: 2017-07-25T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
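Each coded comment is scored on four closed-set dimensions. A minimal record type for one coded row can be sketched as follows; the value sets below are inferred only from the responses visible on this page (the full codebook may define more categories), and the class and constant names are illustrative:

```python
from dataclasses import dataclass

# Category sets inferred from the raw responses shown on this page;
# the actual codebook may contain additional values.
RESPONSIBILITY = {"none", "company", "developer", "user", "ai_itself"}
REASONING = {"consequentialist", "deontological", "virtue", "mixed", "unclear"}
POLICY = {"none", "ban", "liability", "unclear"}
EMOTION = {"indifference", "approval", "disapproval", "outrage", "mixed"}


@dataclass(frozen=True)
class CodedComment:
    """One coded comment: an ID plus four categorical dimensions."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self) -> None:
        # Reject values outside the observed category sets.
        assert self.responsibility in RESPONSIBILITY
        assert self.reasoning in REASONING
        assert self.policy in POLICY
        assert self.emotion in EMOTION
```

Validating against closed sets at construction time catches malformed model output (e.g. a hallucinated category) before it enters downstream analysis.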
Raw LLM Response
```json
[
{"id":"ytc_UgzLfa4wDAxEE-DnOk14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw8vnsRdhUYoVEeC7p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_3fyLhrMbWTkim-l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy3m11qmSsA2P8E3GZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyo2hSnzg9Y8b7i16h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJuGhYth23xfAg25V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgitX4hSzKK4wXgCoAEC","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"},
{"id":"ytc_UghXkzlL2wwLPHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjW7cd-m5pz9HgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgihoGq_oAtLbXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
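The raw response is a JSON array with one object per coded comment, so the per-comment lookup this page offers can be sketched as a dictionary keyed by comment ID. This is a minimal illustration, not the tool's actual implementation; the IDs are taken from the response above, and the variable names are made up:

```python
import json

# Raw model output: a JSON array, one object per coded comment.
# Two rows copied from the response above, for illustration.
raw_response = """
[
  {"id": "ytc_UgzLfa4wDAxEE-DnOk14AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy3m11qmSsA2P8E3GZ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_Ugy3m11qmSsA2P8E3GZ4AaABAg"]
print(row["responsibility"], row["emotion"])  # company outrage
```

Batching many comments into one response and indexing by ID keeps each coded row joinable back to its source comment even when the model returns them out of order.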