Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "If things do get that bad, it will be inevitable that people will want to destro…" (ytc_UgzgWKNfZ…)
- "In roughly the same way, a person can have outer beauty while inside there are spirits of malice…" (ytc_UgwMEgek3…, translated from Russian)
- "Creators don’t get to consent to what codecs or compression algorithms are used …" (ytc_UgyV-0Nfe…)
- "It’s not about the ai generated art. It’s about the training of the model. Usi…" (rdc_kz0dtzk)
- "I must say that the DAN jailbreak is not a jailbreak. Every time I try to access…" (ytc_UgxGovNOB…)
- "There are so many terrific examples of governments pouring billions into frontie…" (rdc_myqrs2k)
- "Btw Tesla’s don’t steer away if humans it just steered away because the person w…" (ytc_UgxhlpS5j…)
- "People like this lady here should be working with governments to form useful reg…" (ytc_UgxDAMh-N…)
Comment
Musk stopped being heavy in AI because he wanted to be heavy in politics. When that didn't work out as well as he hoped, he gave that up and went back to trying to be heavy with AI.
Musk wanted it for the power. But most of the AI people want it for the money.
A dedicated minority want it for the shiny new technology they want to build.
Source: YouTube, "AI Moral Status", 2025-10-31T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyjyMyY_O4NgeZAJjB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxmTHu1yRq14lt75Dd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyNWZ7nhSCvpXJJsDV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxT7RhFToA3B5KS5el4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwBfFsr_6_n16hJfed4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxm7-V2cw080X9sQZx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyCiYAK2ms5Q0A5qhx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxBMJKI2GG-3mj8Qi54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyKp43VLPuelxIF9Kx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxKjYNeaaZSElY40Qx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
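The raw response above is a JSON array with one record per coded comment, each carrying the four coding dimensions shown in the table. A minimal sketch of how such output might be parsed and validated before use; the allowed value sets below are inferred only from the visible samples and are assumptions, not the full codebook:

```python
import json

# Allowed values per dimension, inferred from the samples above.
# These are assumptions, not the authoritative codebook.
DIMENSIONS = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "mixed", "indifference", "resignation"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            continue  # skip records missing an ID
        # Reject records with missing or out-of-vocabulary values.
        if any(rec.get(d) not in allowed for d, allowed in DIMENSIONS.items()):
            continue
        coded[cid] = {d: rec[d] for d in DIMENSIONS}
    return coded

# Usage with a one-record payload in the same shape as the response above
# (the ID "ytc_X" is a placeholder, not a real comment):
raw = ('[{"id":"ytc_X","responsibility":"developer","reasoning":"virtue",'
       '"policy":"none","emotion":"indifference"}]')
batch = parse_coded_batch(raw)
```

Indexing by ID mirrors how the inspector looks records up, and dropping out-of-vocabulary records keeps a single malformed model answer from contaminating the coded dataset.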