Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Lol. No. We will all end up doing our own plumbing because we won't be able to a…" (ytc_UgxxaPUiJ…)
- "Sorry I cant share your excitment because u ignore the impact AI has on society…" (ytc_UgxEB2DR_…)
- "Number one, it ought to be mandated so that AI content must be clearly identifia…" (ytc_UgwC33HQm…)
- "The problem is also robotics which is developing fast. You can see how AI combin…" (ytc_Ugx_pG9Np…)
- "We all justify our jobs can't be replaced with ai, I'm a mechanical engineer and…" (ytr_Ugw4_mBry…)
- "everyone is so scared about AI turning into skynet that nobody has stopped to re…" (ytc_UgzsWpeL3…)
- "He seems like the kind of guy who wouldn't really understand the human interacti…" (ytc_UgwAikpbd…)
- "Musk needs to be prosecuted for lying and false advertising. Now Tesla's AI is p…" (ytc_UgwzWZljM…)
Comment
To say the car has to make an ethical choice and must hit another vehicle is a false premise based argument. The situation is far more dynamic that this. Decisions can be made thousands of times a second. And to say the car couldn't stop in time is saying the car is following too closely already. Something a self-driving car is *not* going to do. This is a made up scenario whose likelihood of occurring is so remote as to remove itself from consideration.
youtube
AI Harm Incident
2015-12-21T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgicJ8o6vgL9vHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjgjA3QBACveXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgiIRvaFLRy4BXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugi6wxkU3JS5u3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjSjaD1amn_NHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ughy05zsMvO4YHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgjFM6BROUj5UHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UghkEkbZMbCpeXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UggNzTObvFdx33gCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugg5W6YbwRYNMHgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
```
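The raw response is a JSON array of per-comment codes across four dimensions. A minimal sketch in Python of how such a response might be parsed and validated before the codes are stored; the allowed value sets below are only those visible in this response, so the real codebook may define additional categories, and `parse_codes` is a hypothetical helper name:

```python
import json

# Allowed values per dimension -- inferred solely from the codes visible
# in this response; the actual codebook may include more categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "government", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed"},
    "policy": {"none", "unclear", "regulate", "industry_self", "liability"},
    "emotion": {"approval", "outrage", "indifference", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting
    records with a missing id or an out-of-codebook value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example with a made-up comment id:
raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"deontological",'
       '"policy":"none","emotion":"indifference"}]')
print(parse_codes(raw))
```

Failing loudly on unknown values catches both malformed model output and silent drift in the coding scheme before bad codes reach the database.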