Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
@MrGrantGregory
Thank you very much...!!!
😀😀😀
About robot boxing league I will …
ytr_UgzKIkQoy…
Ill be honest, I don't really care about AI art, the people who pay for art are …
ytc_UgxNTmXKX…
I think there are 3 big problems with AI:
1) It's currently crammed into and use…
ytc_Ugx1J1tjM…
This is the part that disheartens me...I am in the process of writing a novel, t…
ytc_UgyvqzHYO…
@IGNii7ed What is your reasoning on that? We can make things stronger, faster, big…
ytr_Ugjx2gfLE…
Great video. Maybe it will take an AI disaster to wake us up—just like it did wi…
ytc_Ugx1Ii5Y3…
you cannot even form a coherent sentence. you should NOT be talking to addictive…
ytr_UgwqZ8ChH…
Your understanding of how ChatGPT works "day to day" is off. The model doesn't c…
ytc_UgwvBGKhw…
Comment
“I dont think people should expect perfection” — wft is that?
It immediately raises the question: why is beta-testing being done on real public roads in the first place? And more importantly — who takes responsibility when a fatal crash happens? “Don’t expect perfection, accidents happen” is not an acceptable excuse. If a human driver hits someone, they go to jail. But if a robotaxi does it, who’s accountable? The company? The engineers? No one?
youtube
2025-12-10T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugy9yBdMmNZWUEdmykB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgFZsTlKIF_Kc1WnJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw7FAKSP8mBo-1XGdR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx1RIXXYJ18w8GylAZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQIqOzbKB5dKVn-W54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx24sSiIN8FIlLhS3J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwkzPNDYJEsejt9X1Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxvMWGJXXaftF0J6uh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgwvHaR-WDYCbo-W8-p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyf52QqwMUevQAReYJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
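A coding pipeline that consumes responses like the one above needs to parse the JSON and discard malformed records before storing the dimension values. The sketch below is one minimal way to do that; the per-dimension vocabularies are inferred from the sample output shown here and are assumptions, not the project's actual codebook.

```python
import json

# Allowed values per dimension, inferred from the sample LLM output above.
# The real codebook may define additional categories (assumption).
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"outrage", "fear", "indifference", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept if it is a dict with an "id" field and every
    dimension holds a value from the allowed vocabulary.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # drop records with no comment ID to join back on
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Example: one well-formed record passes, a bad dimension value is dropped.
raw = (
    '[{"id":"ytc_x","responsibility":"company","reasoning":"deontological",'
    '"policy":"liability","emotion":"outrage"},'
    '{"id":"ytc_y","responsibility":"martians","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(validate_codings(raw)))  # 1
```

Validating against a closed vocabulary also catches the common failure mode where the model invents a new category mid-batch; rejected records can then be re-queued for recoding rather than silently polluting the results table.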