Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
This hypothetical does not stand up to any scrutiny. Any good self-driving program would maintain ample distance between itself and the vehicle in front, enough room to brake safely under any circumstances, and if it didn't, then that would be the design flaw. Secondly, in any case, the car would simply brake and the car behind would crash into it, thereby creating liability for the car in the rear. The autonomy of the car has nothing to do with the matter, and this is objectively a non-issue.
youtube
AI Harm Incident
2015-12-10T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
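A coded record like the one in the table above can be carried around as a small typed structure. This is a minimal sketch: the field names mirror the table's dimensions, and the `CodingResult` class itself is an illustrative assumption, not part of the coding pipeline; the timestamp is assumed to be ISO 8601 as shown.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class CodingResult:
    """One coded comment, mirroring the dimensions in the table above.

    Note: this class is a hypothetical illustration, not the tool's own model.
    """
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime


# The values from the "Coding Result" table above.
result = CodingResult(
    responsibility="developer",
    reasoning="consequentialist",
    policy="liability",
    emotion="indifference",
    coded_at=datetime.fromisoformat("2026-04-27T06:24:59.937377"),
)
print(result.emotion)  # → indifference
```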
Raw LLM Response
```json
[
  {"id":"ytc_Ugi6QnAsmjUnJ3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghFvsP8eJwde3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggeP-aEJVCsH3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgiCtG8B4yTOPXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghRaFbj5Dr8UXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgifStBUPZbFQXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgjgWMosOf9y_3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgjA1lZE8jvePngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj5Kvaeg5HsAngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgibjtNUDEehjngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
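A batch response like the one above can be parsed and sanity-checked before its codes are stored. The sketch below is a hypothetical validator, not part of the tool itself; the allowed values are only those observed in this document (the full codebook may be larger), and the `ytc_`/`ytr_` ID prefixes are assumed from the samples shown.

```python
import json

# Allowed values observed in this document; the real codebook may have more.
ALLOWED = {
    "responsibility": {"developer", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"liability", "regulate", "none"},
    "emotion": {"indifference", "resignation", "approval", "outrage"},
}


def validate_batch(raw: str) -> list:
    """Parse a raw LLM batch response and check every record's fields."""
    records = json.loads(raw)
    for rec in records:
        # IDs in this document start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}: {rec.get(dim)!r}")
    return records


# One record from the batch above, used as a quick check.
raw = (
    '[{"id":"ytc_UgjgWMosOf9y_3gCoAEC","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"indifference"}]'
)
records = validate_batch(raw)
print(len(records))  # → 1
```

A record with an unknown value in any dimension raises `ValueError`, which keeps malformed LLM output out of the coded dataset.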