Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples
- Robots should be left in the movies where they belong. Why would a civilization … (ytc_Ugyk-ivwV…)
- Instead of debating consciousness, we should address concrete AI threats such as… (ytc_UgwAgGk38…)
- If you pause the video when it compares the AI video with Hotel Transilvania the… (ytc_UgwY5w3WW…)
- So i have newterms seeing how all developing IL: Internet Libraries | Huge data… (ytc_UgyZyZZn7…)
- my answer whenever anyone asks me what i think about ai has become "i'm not gonn… (ytc_UgwF1esIS…)
- We appreciate your engagement with the video content. If you're interested in ex… (ytr_Ugw2JuO5v…)
- Still waiting for AI to tell a basic history story without f i ng it up in the f… (ytc_Ugwa7kCZR…)
- My question is this. If you truly believe that we are living in a simulation rig… (ytc_UgwQayrU8…)
Comment
A supposition: wouldn't it be logically sound to assume that "motorcycles" would be outlawed by the time "self-driving automobiles" were predominant on the road; by design a self-driving car would be judged the safest conveyance and it is more likely law makers would outlaw "unsafe" transportation, along the same lines as horse-and-buggy are not permitted to be on a highway simply because they cannot safely drive at speed with the rest of traffic. One could speculate that a motorcycle too could be self-driven but without the added protection of a car frame to ensure occupant safety, I really think they would be either restricted to low volume roads (such as side streets) or outlawed all together.
youtube · AI Harm Incident · 2015-12-20T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
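A coded result like the table above can be sanity-checked against the value vocabulary that appears across this page's records. A minimal sketch follows; note that the allowed-value sets are inferred only from the records visible here, not from a documented coding scheme, and the real pipeline may accept more categories:

```python
# Allowed values per dimension, inferred solely from the records shown on
# this page; the actual coding scheme may define additional categories.
SCHEMA = {
    "responsibility": {"government", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"regulate", "industry_self", "liability", "none", "unclear"},
    "emotion": {"indifference", "approval", "outrage", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the inferred schema."""
    return [dim for dim, allowed in SCHEMA.items()
            if record.get(dim) not in allowed]

# The coding result from the table above passes cleanly.
result = {"responsibility": "government", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "indifference"}
print(validate(result))  # prints []
```

An unknown or missing value for any dimension shows up in the returned list, which makes malformed model output easy to flag before it reaches the dashboard.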
Raw LLM Response
```json
[
  {"id":"ytc_UgicJ8o6vgL9vHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjgjA3QBACveXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgiIRvaFLRy4BXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugi6wxkU3JS5u3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjSjaD1amn_NHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ughy05zsMvO4YHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgjFM6BROUj5UHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UghkEkbZMbCpeXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UggNzTObvFdx33gCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugg5W6YbwRYNMHgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
```