Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Everyone keeps asking if a AI will kill humans, why is that even a question to b… (ytc_Ugz1E468Q…)
- Another analogy I've heard: Consider this. Ai is burning forests, water, and res… (ytc_UgxskXMYY…)
- Every software engineer knew this from the beginning. AI is a great tool if you … (ytc_Ugy2yvUEs…)
- This was a pointless debate because the phrasing of the resolution was far too l… (ytc_Ugw-YUD7i…)
- “How dare you write that essay on a computer instead of writing it on paper. I j… (ytc_UgwGe4hPZ…)
- Tesla is such a broken company. It's cars are junk. Autonomous driving is well b… (ytc_UgwnVm14K…)
- This may be why one of the goals of the Cloward-Piven strategy, which is a polit… (ytc_UgyTgvD0Q…)
- @CRZY_VESPER AI is inevitable. It cannot be stopped at this point. Being anti-AI… (ytr_Ugw9I9VEM…)
Comment
And if its raining hard and you can't weave? The failure is auto pilot. Its next to impossible to teach people not to run is down, we have to remind them hard that we could be their person, so pay more attention, AI will never have a reason to care. And progress for progress sake will demand our doom not the auto pilots.
Source: youtube | AI Harm Incident | 2022-09-06T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
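For reference, the coding result above maps onto a simple record. Below is a minimal sketch of that shape; the `CodingResult` dataclass and its field values are inferred from the table and the raw response below, not a confirmed schema of the tool.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical record shape inferred from the Coding Result table above;
# the tool's actual storage schema is not shown in this view.
@dataclass
class CodingResult:
    responsibility: str  # e.g. "ai_itself", "developer", "user", "company", "distributed", "none"
    reasoning: str       # e.g. "consequentialist", "deontological", "mixed"
    policy: str          # e.g. "none", "regulate", "ban", "liability", "industry_self"
    emotion: str         # e.g. "fear", "outrage", "approval", "indifference", "mixed"
    coded_at: datetime   # when the LLM coding was stored

example = CodingResult(
    responsibility="ai_itself",
    reasoning="consequentialist",
    policy="none",
    emotion="fear",
    coded_at=datetime.fromisoformat("2026-04-27T06:24:59.937377"),
)
```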
Raw LLM Response
[
{"id":"ytc_UgzIbw6c92D1mwJw7JV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwvAqn6txKzSTq30Rp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiBYRQr7-tLUD73Uh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgztUyK3QzMS63okyoF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxsaB7VhsHPwDEiMfB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwg2zSxAT6VPSJbZoJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyKxnz4TcPQGMxInv14AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugyo1BdMSzpYX1S21CJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx1kocpJlEG0x7CbMd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz77UtyWjftcu0k1414AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
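The raw response is a JSON array covering a whole batch of comments, so looking up one comment means scanning the array for its ID. A minimal sketch of that lookup, assuming the response text is available as a string; the `lookup_coding` helper is illustrative and not part of the actual tool.

```python
import json

def lookup_coding(raw_response: str, comment_id: str) -> dict | None:
    """Parse a batch LLM response and return the coding dict for one comment ID."""
    try:
        items = json.loads(raw_response)
    except json.JSONDecodeError:
        return None  # malformed model output; nothing to look up
    return next((item for item in items if item.get("id") == comment_id), None)

# raw_response would hold the JSON array shown above, e.g. loaded from storage:
raw_response = """[
  {"id": "ytc_UgztUyK3QzMS63okyoF4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]"""
print(lookup_coding(raw_response, "ytc_UgztUyK3QzMS63okyoF4AaABAg"))
```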