Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- ytc_UgyLbuapc… : I asked CHATGPT a couple weeks ago, “Could a 30-6 have killed Charlie Kirk? He d…
- ytr_UgzBeV2AJ… : Tell your business partner that shit looks gross, fake and cheap. Be sure to ma…
- ytc_Ugx5309Q7… : I actually do this, I don't always say please or thank you, I do when I'm like W…
- ytr_UgwDbdejO… : And I see a dumb ass making a point out from actual nothing. "All ai "artist" ar…
- ytc_UgzSGqknp… : What does an AI need a paycheck for? ...Maybe make them pay for themselves. AI r…
- ytc_Ugzn6XuM5… : Ai art is not real art and art is more than typing up a prompt it’s how much blo…
- ytc_UgzOVqN0B… : The scariest thing is that a.i has no concepts of time so it's not limited by ou…
- ytc_UgxljkNon… : The disability argument is always really weird to me because like... They can't …
Comment
+Jaden Oldfield: “Even if there is not enough time, I believe that the self driving car will still break, reducing the amount of damage and lower the risk of harm to the passengers and other motorist.”
youtube · AI Harm Incident · 2015-12-08T16:5… · ♥ 14
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgggitcG_CbrUXgCoAEC.87WDKCb8uB_87Wc8Git_dg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087ZUsAi0XrE","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087Zk-pe2Tvs","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgjFbGyekK77fngCoAEC.87VyK9Y8dlO87WlJMEeWoL","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgiiYSCGtUOQQ3gCoAEC.87VxmXkagiW87W5Qy0QLaG","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UghlFJ0pJ4lt_ngCoAEC.87Vxapt4mjd87W8NWV8_40","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UghfG4rKbyLwlXgCoAEC.87VxK1IcVdT87W9BZPSPbn","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"mixed"},
{"id":"ytr_UghiyJc91JDD0XgCoAEC.87Vw0Yij8ek87VyZ_yCQtY","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytr_UghiyJc91JDD0XgCoAEC.87Vw0Yij8ek87VzogUdvsL","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugg-LAtK9Y2urngCoAEC.87VuXpLFzck87VxWEbSmkv","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
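The raw response above is a JSON array with one object per comment ID and one label per coding dimension. A minimal sketch of how such a response could be parsed and validated, assuming the allowed value sets are those visible in this document (the real codebook likely contains additional categories):

```python
import json

# Allowed labels per dimension, inferred only from values visible in this
# document -- an assumption, not the full codebook.
CODEBOOK = {
    "responsibility": {"user", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "industry_self", "regulate", "ban"},
    "emotion": {"approval", "indifference", "resignation", "mixed", "fear"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: label}},
    rejecting any entry whose label falls outside the codebook."""
    coded = {}
    for entry in json.loads(raw):
        comment_id = entry["id"]
        labels = {dim: entry[dim] for dim in CODEBOOK}
        for dim, label in labels.items():
            if label not in CODEBOOK[dim]:
                raise ValueError(f"{comment_id}: invalid {dim}={label!r}")
        coded[comment_id] = labels
    return coded

# Usage with a hypothetical single-entry response:
raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
result = parse_llm_response(raw)
print(result["ytr_example"]["responsibility"])  # ai_itself
```

Keying the parsed output by comment ID matches the dashboard's "look up by comment ID" workflow, and failing loudly on out-of-codebook labels makes malformed model output visible instead of silently stored.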