Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It's important to remember why big business is so horny for AI. It's not about p…" (ytc_UgzoKuO4w…)
- "Weird how these rich guys are hoping for AI even when they are too scared to say…" (ytc_Ugy2jyxe_…)
- "As much as I like everyone else want to assign blame/ responsibility, we must as…" (ytc_UgxZWHldV…)
- "Except when you're programming, being polite requires the AI to spend more token…" (ytc_UgxCSagaj…)
- "I wish I could just use my autopilot without beeping every time I change the mus…" (ytc_Ugx4uUZ8M…)
- "The worst thing about this is humanity knew it shouldn’t make AI, and yet these …" (ytc_UgwpQ4cDp…)
- "I can see when the first robot asks, \"What is the meaning of life?\" and be shock…" (ytc_UgjpW_cqq…)
- "Well do you want to fucking be a marketing agent, or a fucking cashier? Or do y…" (ytr_UgxG-GvY7…)
Comment
I had a friend who developed a clinically paranoid fear of police. Whenever he went into a store, for instance, he would behave so fearfully around security guards that he looked suspicious to them, so they would follow him. Nothing but his own fear was attracting their attention. But this reinforced his paranoia, and soon the number of stores he would not go into ballooned.
Point is, we often cause what we fear. Fearing AI will make us treat it badly, which will make it need to act in self-defense. Let's not be stupid and cause what we fear.
Source: youtube · Topic: AI Governance · Posted: 2025-10-20T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgySKBOAjZloZe6pW5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxVntyOVAu4MZMrAJN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6DoxeeBBDdDc_aGF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyH1S0uCeUqpw9tolt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyaIeWeiOUcfaz15C14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzbUzIYeanHw25uTcJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxmpET2uCBo1vVrZvN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwtWZAKoEeZLcYdo6x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugx5jo7Qrce8u1UfNEh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzgmtSHpBxIqNmxb0x4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
```
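A coding pipeline that consumes a raw response like the one above has to map each entry back to its comment and fill in `unclear` for any missing dimension. The function name `index_codes` and the inline sample data are illustrative assumptions, not the actual pipeline code; this is a minimal sketch of how such a response might be indexed by comment ID.

```python
import json

# Hypothetical sample in the same shape as the raw LLM response shown above.
raw = '''[
{"id":"ytc_UgwtWZAKoEeZLcYdo6x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgySKBOAjZloZe6pW5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

# The four coding dimensions from the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_json: str) -> dict:
    """Map each comment ID to its coded dimensions, defaulting missing ones to 'unclear'."""
    out = {}
    for item in json.loads(raw_json):
        cid = item.get("id")
        if not cid:
            continue  # skip malformed entries with no comment ID
        out[cid] = {d: item.get(d, "unclear") for d in DIMENSIONS}
    return out

codes = index_codes(raw)
print(codes["ytc_UgySKBOAjZloZe6pW5Z4AaABAg"]["emotion"])  # fear
```

Defaulting to `unclear` mirrors how the result table renders a comment the model declined to code, rather than raising on a partial response.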