Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "AI would’ve been great if they weren’t constantly telling us peons how they want…" (`ytc_UgxESs2yI…`)
- "The most successful regulation approach seems to be to make AI companies legally…" (`ytc_UgzSexdip…`)
- "All of this AI can't overcome politics and the need of a functioning economy. So…" (`ytc_Ugz8pieuO…`)
- "Here is something else to consider. What will the US government do if Altman and…" (`ytc_Ugz71zXLH…`)
- "Why mention Tesla? ... IF YOU NEED A LICENSE - ITS NOT AUTONOMOUS :)) Said the …" (`ytc_UgyRIOQx4…`)
- "There must be laws that AI-generated content must be marked and must also be imm…" (`ytr_Ugyam4NBG…`)
- "Bernie is right, but he doesn't go far enough with his suggestions. In the limit…" (`ytc_UgyOIKtiR…`)
- "I saw an AI GENERATED AD and not on youtube oh no ON A BUS STOP WHO APPROVED THA…" (`ytc_UgzIZUg1i…`)
Comment
In the chemical engineering world, we have an organization called the Chemical Safety Board (CSB). One of their tasks is to investigate the cause of major chemical incidents, notable cases being Bhopal and Three Mile. The findings are shared online for anyone to view.
It seems like something similar would be extremely beneficial for autonomous vehicle safety.
Platform: youtube
Topic: AI Harm Incident
Date: 2024-12-27T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxb3j5LNph5-Axj8wJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz2nbJizJfG-FXadXV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxssZXuYG9XmyRbftl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxUh0QQ43e_mpgGs4F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyklKdwJy5yI8OziQh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwrBep8iVLogY7VL_t4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyCRa1nNcINqSLNogF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxgzKFLZ1UL1tB7cF94AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxo3_VluY300m7WmEl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxOTHJqZAxYBAxsFzp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
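Responses in this shape can be parsed and validated before they enter the coding database. The sketch below is a minimal, hypothetical example: the allowed values per dimension are inferred only from the records shown on this page (the actual codebook may define more categories), and the function name `parse_coding_response` is an illustration, not part of the tool.

```python
import json

# Allowed values per coding dimension, inferred from the sample records
# above. ASSUMPTION: the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "user", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "industry_self", "liability", "ban", "regulate"},
    "emotion": {"outrage", "resignation", "approval", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the inferred schema.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: bad {dim}={value!r}")
        coded[comment_id] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Usage with a single (hypothetical) record:
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"outrage"}]')
print(parse_coding_response(raw)["ytc_x"]["emotion"])  # outrage
```

Validating at parse time catches malformed or off-schema model output early, so a bad coding run fails loudly instead of silently skewing downstream counts.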