Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I’d love to see AI go in a crawlspace and pull a new circuit for a microwave in …" (ytc_UgyJVx0BV…)
- "The AI is not the problem itself, the creators of the models and those who feed …" (ytc_UgxMCzh42…)
- "Oddly enough, in the course of creating an AI, we seem unconcerned with asking i…" (ytc_Ugxf4UiS4…)
- "AI is currently creating new jobs, now this may be skewed into losing more than …" (ytc_UgxJ40vPY…)
- "Im sorry for the lose but he was mentally ill and it wasnt because of chatgpt th…" (ytc_UgxU-x7fn…)
- "I get why the public speakerphone thing bugs you — that’s fair. But it seems lik…" (ytr_UgyKdvVxt…)
- "I think it's worth pointing out that autopilot is not the same thing as full sel…" (ytc_UgwRgotJp…)
- "all for it- but each robot or 20 robots pays 1 american a 100k salarty…" (ytc_UgyabJZEk…)
Comment
"I'm never reading the comments about anything related to self-driving cars again. The quantity of stupid here is completely unbearable."
youtube · AI Harm Incident · 2018-04-02T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugzcpl2Uoewzzf5AGol4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz3qkot4zRXKgr4FfF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxCiP_vu0A4vCAZ-JV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx-iio22FEF4fjHbWd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-oJ_C4EmkRixKO9p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwzYm5BnJlKMZ7b8vp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugz3BkzmQLRsrts4_6J4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwPtyDx2ql0cmFpQ4d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxnQhOX6DGkAzOS9iF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzL4AF8kJyEaa9GkCV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
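Responses like the one above are only usable downstream if every record stays within the coding scheme. A minimal sketch of that sanity check, assuming a hypothetical codebook built from the values visible in this dump (the real scheme may allow additional categories):

```python
import json

# Assumed codebook: only the values observed in the raw response above.
CODEBOOK = {
    "responsibility": {"ai_itself", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "industry_self", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "mixed", "resignation", "unclear"},
}

def validate_records(raw: str) -> list[str]:
    """Parse a raw LLM response and list any out-of-codebook values."""
    errors = []
    for record in json.loads(raw):
        for dim, allowed in CODEBOOK.items():
            value = record.get(dim)
            if value not in allowed:
                errors.append(f"{record.get('id', '?')}: {dim}={value!r}")
    return errors

raw = '[{"id":"ytc_example","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"approval"}]'
print(validate_records(raw))  # [] — every value is in the codebook
```

A record that invents a category outside the codebook (a common LLM failure mode) shows up as an explicit error string instead of silently entering the dataset.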