Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific response by its comment ID.
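A minimal sketch of how such a lookup could work outside the page, assuming the raw responses are exported as JSON Lines with one record per coded comment (the file name and field layout are assumptions for illustration, not the tool's actual storage):

```python
import json

def lookup_response(comment_id: str, path: str = "raw_responses.jsonl"):
    """Return the stored record for one comment ID, or None if absent."""
    # Scan the hypothetical export line by line; each line is one JSON record.
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# Example: fetch the full record for one of the coded comments below.
print(lookup_response("ytc_UgicJ8o6vgL9vHgCoAEC"))
```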
Random samples (truncated previews, shown with their comment IDs):

- "I don’t mind the AI taking over jobs as long as I get paid for nothing and freed…" (ytc_UgwEHhZKJ…)
- "Serious question for the anti-AI crowd: What’s effort have to do with art? Since…" (ytc_UgxSg4HUj…)
- "Personally, I think no one has the right to be mad or offended. We wouldn't have…" (ytc_UgwLB_Al-…)
- "Its already happening. The worse scenario is what happens if AI determines that …" (ytc_Ugyi46Kkc…)
- "Im waiting for the flop of ai art so artist that actually pour their heart and s…" (ytc_UgzA4lUC7…)
- "I won't say it will never happen, but I think it's highly unlikely AI can truly …" (ytc_UgxgIZNkE…)
- "What a bunch of morons! you should all be in jail! AI doesn't exist yet, not fo…" (ytc_UgxzYlcqz…)
- "Thank you for sharing your perspective. In the context of AI and human interacti…" (ytr_Ugzgeertd…)
Comment
Ahahahaha!! this is so fucking stupid. First of all, safe following distance is a thing, a self driving car would not be that close. Secondly, even at that distance a self driving car WOULD react in time. It would simply hit the breaks at full force, the boxes falling off don't suddenly lose all their speed faster than the car can. But let's just say it couldn't react in time, what would it do? Hit the boxes head on, where the crumple zones work best, hitting a car side on is a great way to kill someone or cause a greater accident (fucking up all the cars in the surrounding lanes). This video is underestimating the reacting time and breaking ability of modern cars.
youtube · AI Harm Incident · 2015-12-21T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
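A coded record like this can be treated as a small typed structure. The sketch below validates a row against the four dimensions; the allowed value sets are only those observed on this page, so the real codebook may define more, and the class and function names are illustrative:

```python
from dataclasses import dataclass

# Allowed codes per dimension, as observed in this page's sample only;
# the actual codebook may include additional categories.
RESPONSIBILITY = {"ai_itself", "developer", "government", "distributed", "none"}
REASONING = {"consequentialist", "deontological", "contractualist", "mixed"}
POLICY = {"regulate", "industry_self", "liability", "unclear", "none"}
EMOTION = {"outrage", "approval", "indifference", "resignation"}

@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self):
        """Raise ValueError if any dimension holds an unknown code."""
        for field, allowed in (("responsibility", RESPONSIBILITY),
                               ("reasoning", REASONING),
                               ("policy", POLICY),
                               ("emotion", EMOTION)):
            value = getattr(self, field)
            if value not in allowed:
                raise ValueError(f"{field}={value!r} is not a known code")

# Matches the Coding Result table above (the ID is taken from the batch below).
row = CodedComment("ytc_UgjFM6BROUj5UHgCoAEC", "ai_itself",
                   "consequentialist", "none", "outrage")
row.validate()
```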
Raw LLM Response
```json
[
  {"id":"ytc_UgicJ8o6vgL9vHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjgjA3QBACveXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgiIRvaFLRy4BXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugi6wxkU3JS5u3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjSjaD1amn_NHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ughy05zsMvO4YHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgjFM6BROUj5UHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UghkEkbZMbCpeXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UggNzTObvFdx33gCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugg5W6YbwRYNMHgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
```
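Since the model returns one JSON array per batch, the response can be consumed directly. A small parsing sketch, assuming the response text is valid JSON as above (variable names are illustrative; the two rows are copied from the batch):

```python
import json

response_text = """
[
 {"id": "ytc_UgicJ8o6vgL9vHgCoAEC", "responsibility": "ai_itself",
  "reasoning": "deontological", "policy": "none", "emotion": "approval"},
 {"id": "ytc_UgjFM6BROUj5UHgCoAEC", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
"""

# Index the coded rows by comment ID so each can be joined back to the
# original comment record.
by_id = {row["id"]: row for row in json.loads(response_text)}
print(by_id["ytc_UgjFM6BROUj5UHgCoAEC"]["emotion"])  # -> outrage
```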