Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
A robot does not care for justice. A robot cares for what its programming says. If the programming says hurt the least people, it wil do that. If the programming says save the driver at all costs, it will do that, though likely at a coin-flip. Don't let the self-driving car make that decision. This is why we need manual override, so the user can make these decisions a robot can never hope to answer.
Source: youtube · AI Harm Incident · 2017-07-09T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugjnw_pI28jYpXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghwOGDepVXCWngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj3YY9osWlB4HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UghifWP6y7_ogXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg4ldklSPeo8XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjywnlXpJLqFHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjokRxbpwiSqHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghBFWCU-Fp7bngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjOOT8Vua498ngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgirnfINQGNpP3gCoAEC","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
```
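A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, not the tool's actual pipeline: the allowed values per dimension are assumed from the labels visible on this page (the full codebook may define more), and `parse_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension (assumed from the values
# visible on this page; the real codebook may include others).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "resignation", "fear", "approval", "mixed", "outrage"},
}

def parse_response(text):
    """Parse a raw LLM batch response and validate every coded row.

    Returns a dict keyed by comment ID so a single comment's codes
    can be looked up directly, as in the inspector above.
    """
    rows = json.loads(text)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim} value {value!r}")
    return {row["id"]: row for row in rows}

# Two rows copied from the raw response above.
raw = """[
{"id":"ytc_UgjywnlXpJLqFHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgirnfINQGNpP3gCoAEC","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]"""

coded = parse_response(raw)
print(coded["ytc_UgjywnlXpJLqFHgCoAEC"]["emotion"])  # fear
```

Validating against a closed vocabulary at parse time catches the most common LLM failure mode here: a value outside the codebook (e.g. a free-text emotion) fails loudly instead of silently entering the dataset.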