Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgwJw2DzW…` — "Intelligence is the ability to acquire and apply knowledge and skills. The idea…"
- `ytc_UgzUx893k…` — "very few people understand how AI codes work. once it goes to AGI and starts to…"
- `ytr_UgyBKzMC8…` — "We're glad to hear you appreciate Sophia's approach! It's refreshing to see an A…"
- `ytc_UgwRisYqw…` — "AI is being used by genocidal Israel for ethnic cleansing. Where’s Daddy and Lav…"
- `ytc_Ugy3N6toH…` — "Well, time to misuse it on the people who support this motion and watch them bac…"
- `ytc_UgyMpCTBb…` — "Much of my driving the past few years had been local. However, I recently drove…"
- `ytc_Ugyg4zqTi…` — "The story about the AI drone killing its operator was debunked. The story teller…"
- `ytc_UgySwPQOa…` — "Sorry but if anyone can be Deepfake does it not kind of loose it’s power? I mean…"
Comment
> The biggest safety issue in every car is the human driver.
> Teslas "Autopilot" is only a driver assist system. The Model S is not a self driving car. The human driver is clearly responsible for this crash.
> However automatic braking is not new and should have worked in this case. The radar should have seen the obsticle, while the camera failed to do so.
> Mercedes for example has automatic braking for quite a while and afaik there haven't been any reports about failures like that.

Source: youtube · AI Harm Incident · 2016-07-02T05:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx0qtypp6SCITTSqRl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgznRgyu8fu1bQYeY8h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy1U7irBqxMzWnBSYZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxTorq7V9o2sLu92wJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy0VnwSlCpl-Pfc78h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx1lzNSE5QEBF1gho54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxlDKVGN9w4WFpMXWF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwRrdN4ObLi77vdRb94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwjE1GeJpxUTR3uf9x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiRdMUEklJtFXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
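Looking up a coded comment from a response like the one above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the `index_codes` helper and the short sample payload are hypothetical, while the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the JSON shown.

```python
import json

# Hypothetical sample payload in the same shape as the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgiRdMUEklJtFXgCoAEC", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"}
]
"""

# The four coding dimensions seen in the response and the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_codes(raw: str) -> dict:
    """Parse a raw LLM response and index records by comment ID,
    skipping any record that is missing one of the four dimensions."""
    records = json.loads(raw)
    return {
        r["id"]: {dim: r[dim] for dim in DIMENSIONS}
        for r in records
        if all(dim in r for dim in DIMENSIONS)
    }


codes = index_codes(raw_response)
print(codes["ytc_UgiRdMUEklJtFXgCoAEC"]["responsibility"])  # user
```

Indexing by ID up front makes per-comment inspection a constant-time lookup, which is handy when cross-referencing a coding-result table against the batch response it came from.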