Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `rdc_dwv6rxm` — "I believe he's talking about solving the moral dilemma. Here is the nature of …"
- `ytc_UgzNqw1Qd…` — "hmm,the corporate language is strong in this one. but ''colonial'' ai made me pi…"
- `ytc_Ugwrs_CK_…` — "Mexico 🇲🇽 americans .Spain americans. Thailand 🇹🇭 americans .are you livin…"
- `ytr_UgxERAJW8…` — "another thing to add as another autistic person is that i also use a lot of art …"
- `ytr_UgycoXJ8F…` — "Haha, love the reference! It's interesting to think about how AI, like Sophia in…"
- `ytc_UgwbngwT1…` — "The first 10 seconds of this talk makes me question his reasoning. With the adva…"
- `ytc_UgypuUn-E…` — "Any chance I can get the individual totals per AI? I'd be interested to know not…"
- `rdc_g13z1i0` — "We have a bunch of semi-autonomous vehicles on the road. People use them to part…"
Comment

> There is no ethics to figure out, the car should just stop if there is no way around. Automated cars are aware of everything around them, everything, this scenario is not possible unless the people are behind something solid n without looking or on purpose jump in front of the car.

youtube · AI Harm Incident · 2014-05-25T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
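The table above is one coded record with four fixed dimensions plus a timestamp. As a minimal sketch of that record shape — the class and field names are hypothetical, and the value lists are inferred only from the samples visible on this page:

```python
from dataclasses import dataclass

# Hypothetical record shape for one coded comment. Field names mirror the
# table above; the example values in comments are only those seen in this
# page's data and may be an incomplete set.
@dataclass
class CodingResult:
    comment_id: str      # e.g. "ytc_…" (YouTube comment) or "rdc_…" (Reddit comment)
    responsibility: str  # e.g. "ai_itself", "company", "none"
    reasoning: str       # e.g. "deontological", "consequentialist", "unclear"
    policy: str          # e.g. "none", "liability", "industry_self"
    emotion: str         # e.g. "indifference", "approval", "fear", "outrage", "mixed"
    coded_at: str        # ISO 8601 timestamp

# The record shown in the table above:
result = CodingResult(
    comment_id="ytc_UgiQ91bEWOUkuHgCoAEC",
    responsibility="ai_itself",
    reasoning="deontological",
    policy="none",
    emotion="indifference",
    coded_at="2026-04-27T06:26:44.938723",
)
```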
Raw LLM Response
```json
[
{"id":"ytc_UgiWcB2yghriXngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjGpMIElKUcbHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiQ91bEWOUkuHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiKVLZeD_H-iHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UggCYfK-9jRh7HgCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugj-SSej1PEkcHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgimsOSI-0reJ3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugh4WDO7or0sBHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Uggp2zaQuKmd7ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjcFgyqVoluwHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"}
]
```
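Because the model returns one JSON array per batch, each record has to be parsed, checked against the coding scheme, and indexed by comment ID before it can be looked up. A minimal sketch of that step — this is not the project's actual pipeline, and the allowed-value sets are inferred only from the samples visible on this page:

```python
import json

# Allowed values per dimension, inferred from the visible samples;
# the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "industry_self"},
    "emotion": {"indifference", "approval", "fear", "outrage", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw batch response and index valid records by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# One record from the batch above, as a round-trip check:
raw = ('[{"id":"ytc_UgiQ91bEWOUkuHgCoAEC","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
coded = validate_batch(raw)
```

Indexing by ID is what makes the "Look up by comment ID" view above cheap: once the batch is validated, each coded comment is a single dictionary access.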