Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Should've asked it about drugs. Would be interesting to see how ChatGPT stands u…" (ytc_UgxC1JEQC…)
- "The largest company are keep investing in more advance AI. The only reason that …" (ytc_UgwRRW8sb…)
- "Wall street put quadrillions in this. And now the government will have to choose…" (ytc_UgxZcdIIl…)
- "AI is the absolute worst kind of opportunity for those wealthy and powerful to m…" (ytc_UgyaaXCxI…)
- "If AGI and ASI are possible we have already lost, it's just a question of when.…" (ytc_UgyHFYV4e…)
- "Sophia, the AI robot, may possess vast amounts of information and process data f…" (ytr_UgxXKXJz0…)
- "accelerator pedal was applied for extended period / what does that conclude? / if an…" (ytc_UgyJ6fZhT…)
- "+ With the help of a chatbot, I will correct errors in my writing if I seriously…" (ytr_Ugzuz64jH…)
Comment

> How about drive the car? How have we become so dependent on AI to handle the responsibility of carrying out an operation where lives are vulnerable? Will this get you out of a DUII charge if the car is to blame for killing someone? Auto pilot is for airplanes in the sky where there isn’t a myriad of objects to hit. They don’t use it on the ground to drive the plane to the gate.

youtube · AI Harm Incident · 2022-09-06T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
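Each coded record assigns exactly one value per dimension. The sketch below checks a record against the value sets observed in this batch; the sets are illustrative assumptions drawn from the raw response shown here, not the full codebook.

```python
# Values observed per dimension in this batch; the real codebook may be larger.
ALLOWED = {
    "responsibility": {"user", "ai_itself", "developer", "company", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"liability", "regulate", "ban", "industry_self", "none"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "approval"},
}

def validate(record: dict) -> list:
    """Return the dimension names whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

record = {"responsibility": "user", "reasoning": "deontological",
          "policy": "liability", "emotion": "outrage"}
print(validate(record))  # [] — every dimension carries an allowed value
```

A non-empty result flags a record for manual review rather than rejecting the whole batch.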
Raw LLM Response
```json
[
{"id":"ytc_UgzIbw6c92D1mwJw7JV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwvAqn6txKzSTq30Rp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiBYRQr7-tLUD73Uh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgztUyK3QzMS63okyoF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxsaB7VhsHPwDEiMfB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwg2zSxAT6VPSJbZoJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyKxnz4TcPQGMxInv14AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugyo1BdMSzpYX1S21CJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx1kocpJlEG0x7CbMd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz77UtyWjftcu0k1414AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
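Since the model returns one JSON array per batch, looking a comment up by ID reduces to parsing the array and indexing it. A minimal sketch, assuming the raw response is valid JSON (abbreviated here to two records from the batch above):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment ID.
raw_response = """
[
 {"id":"ytc_UgzIbw6c92D1mwJw7JV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugyo1BdMSzpYX1S21CJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"}
]
"""

# Index records by comment ID for constant-time lookup.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coding record for a comment ID, or None if absent."""
    return records.get(comment_id)

coded = lookup("ytc_UgzIbw6c92D1mwJw7JV4AaABAg")
print(coded["responsibility"], coded["emotion"])  # user outrage
```

In practice the model output may carry markdown fences or trailing prose around the array, so a production loader would strip those before calling `json.loads`.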