Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Great video! Just pay for your robot like a capitalist. People take out 30 mor…" (ytc_UgwAqZQVj…)
- "its also kinda concerning on the few where death is "inevitable" the ai would tr…" (ytc_UgzoqKoj8…)
- "I agree with you that there will be less consumers who can afford what AI is pro…" (ytr_UgxtdDIwm…)
- "That’s wild. Here, let me use this strange data-mining cloud thing instead of my…" (rdc_jvkuw6t)
- "This is why I am afraid of tools like claude code etc. Its easy to fall into the…" (ytr_UgxJtSMxK…)
- "To all the AI haters and dislikers let me tell you this ai is not stealing anyon…" (ytc_UgwFjxra1…)
- "My husband and I were discussing this. If AI was truly smart, it might gain cons…" (ytc_UgwJaq-J1…)
- "The problem with AI isn’t going to be peril, it’s going to be that AI’s ability …" (ytc_Ugw8S9koC…)
Comment

> The one to blame is the careless person who was suppose to secure the truck load. On top of this, what if people exponentially processes information faster than their physical speed? Meaning they see the logs falling out and know it's about to hit them with plenty of time to react but is also boxed in. What are they suppose to do? Whatever decision it is, it won't be more random than the decision of a self-driving car.

Source: youtube · AI Harm Incident · 2017-06-24T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugjp-pcf8PXx8HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgiIjTXIJ5B6wHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgicjfJMB8sNk3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugghpi3fGQwm63gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UghvDWbrWZGpnngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugjd8L8jNpzFtHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghC4VdT2HTcDHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugg4Bv06oKkXP3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UghZnL5q_KLZvHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UggPbviiDUEtwHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
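A raw batch like the one above can be checked before it is stored. The sketch below parses an LLM response and keeps only rows whose four dimensions match the codebook; the label sets are the values observed on this page, and the real codebook may allow more (an assumption), so treat `SCHEMA` as illustrative.

```python
import json

# Label sets observed in this page's samples; the full codebook may
# define additional values (assumption).
SCHEMA = {
    "responsibility": {"user", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "industry_self", "regulate", "unclear"},
    "emotion": {"outrage", "approval", "fear", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Skip anything that is not an object with a comment ID.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Every dimension must be present and carry an allowed label.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

sample = ('[{"id":"ytc_Ugjd8L8jNpzFtHgCoAEC","responsibility":"user",'
          '"reasoning":"consequentialist","policy":"none",'
          '"emotion":"indifference"}]')
print(len(validate_batch(sample)))  # 1
```

Rows with unknown labels are dropped rather than corrected, so a malformed response can be re-queued for recoding instead of silently polluting the table.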