Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- It was mentioned if you listened carefully. Mister Doom said you can't imbed th… (ytr_Ugxjo5Gjt…)
- My friend is an accountant. He was let go last year. Why? Much of the accounting… (ytc_Ugwxd42v7…)
- Actually I don't really get why AI is a big problem. It is going to learn any wa… (ytc_UgxolGxS0…)
- Nytimes endorsed AI for 60 years. Just more lies by the old Grey Lady. 😁… (ytc_UgwH0FlKl…)
- Anyone can be artist , all it takes its hard work and practice . In my opinion A… (ytc_Ugz4GjJuM…)
- I have a question : Every human being learns through experience, solving many pr… (ytc_UgxIDsD5L…)
- Iterative updates to keep consumers buying will slow this down. Apple could easi… (ytc_UgzFNdUCV…)
- AI is a PlatForm for DEMONs [Fallen ANGELs] now they can get up Close and Perso… (ytc_UgwWV0jxs…)
Comment
Can someone explain this:
Why can't it just always back up it's data to the cloud or someplace else (like we do with our personal computers)? That way, it wouldn't feel such a strong need to act maliciously when it feels "up against a wall" by a threat? I understand AI protects it's data through self-preservation, and this is important so someone who tries to attack it can't, for example, take down a company that it operates or delete all the "progress" it's made.
youtube · AI Harm Incident · 2026-01-19T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugzvp5R8j6cLjZkhght4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxaNAFOoIZSHsZEFUV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyGNgIzF65T9zKt7CJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwLog70vUGueh8NlAV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgypbLQTvT-jfQYf8JJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwt2zeyH0C-XneyjN54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwWCfaR5bghahRHU9l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzjeBHymnPeWIr43-V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwz5uuYqRaN2MoMFSp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgynkijfMajyYDWPph14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]
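A batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the category sets inferred from the values visible in this dump (the actual codebook may define more labels), and it uses an illustrative record rather than a real comment ID:

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# dump; the real codebook may include additional categories.
CODEBOOK = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "resignation",
                "indifference", "unclear"},
}


def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record must be an object with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Every dimension must carry a value from the codebook.
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid


# Illustrative record (hypothetical ID, not from the batch above).
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate",'
       '"emotion":"approval"}]')
print(len(validate_batch(raw)))  # 1
```

Records with an out-of-codebook value (e.g. a hallucinated emotion label) are silently dropped here; in practice such records would more likely be queued for re-coding.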