Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a response by its comment ID, or pick one of the random samples below.
Random samples
- "This is pretty dumb. Its unnecessary to humanize robots. We wont have to worry a…" (ytc_Ugi2WXL0T…)
- "AI art is a lot like asking machines to make guns, and expecting people to be ab…" (ytc_Ugwdmc-I_…)
- "@thewannabecritic7490 No it doesn't. Two judges have now recently ruled that tra…" (ytr_UgwWuqdGF…)
- "AI is great. AI is like your youtube algorithm or a predictive algorithm. Genera…" (ytr_UgyU0akZe…)
- "We should stop treating AI like a tool and more like a child to raise it to have…" (ytc_UgzCDm9i0…)
- "As someone who is a strong AI enthusiast, and user, he makes some great points. …" (ytc_UgxqS7gzt…)
- "Neil’s probably right that AGI isn’t around the corner (and now the scaling law …" (ytc_UgywIOUOY…)
- "It's actually illegal in all states to make money off someone's likeness. By all…" (ytc_UgyAMkK41…)
Comment
so the way the mechahitler incident is covered in this video is really, really inadequate and weak. that was not an "AI fuckup." that was Musk changing the system prompt of Grok to try and make it "anti-woke," leading to a predictable result. he was not a victim in that scenario; he brought it about because he wants to make an LLM that lies about trans people. it wasn't a thing that HAPPENED to Musk. it was a thing he DID, while trying to spread disinformation.
I will actively be steering people away from this channel's coverage in the future because of this lapse. the billionaire did it to himself. don't give him a pass. fucking? pathetic.
Source: youtube · Video: AI Moral Status · Posted: 2026-01-06T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyB3zcpYV8n7H5lGbZ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgySgNDoW1MGuy9x8dp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw7wyyS6gLbbK6rUDZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxQi1xFcxvzsfvPdPx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
{"id":"ytc_Ugwn_cRgP2a9F_SkgIR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzWIAbUktim3KB1U4N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyGDgIQ1L6-sZRaBmV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxfUX8HPhClAK_Q-A54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyfMGRhohTMHmyI6M14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzMPPR-7Exu1F2KYWR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
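One way to consume a raw response like the one above is to parse the JSON array into a lookup table keyed by comment ID, validating each dimension against the categories seen in the coded output. This is a minimal sketch: the allowed value sets below are inferred from the responses shown on this page, not taken from an authoritative codebook, and `parse_batch` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the observed responses
# (an assumption, not the tool's official schema).
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "disapproval", "approval", "mixed"},
}


def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, rejecting out-of-vocabulary codes."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec.get(dim)!r}")
        # Keep every dimension except the ID itself as the record body.
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded
```

With the response above loaded as `raw`, `parse_batch(raw)["ytc_UgzMPPR-7Exu1F2KYWR4AaABAg"]` would return the last record, whose values match the Coding Result table (developer / deontological / liability / outrage).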