Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Well my theory on artificial intelligence gaining conscience based on movies and shows is that the AI will continuously learn any info at the same time adapt to new knowledge. So when you teach it psychology, ethics, and current world problems and catastrophe, it might consider humanity as a whole as potentially destructive to both the environment and itself. Although it's for from possible we need a special kill switch should this happen
Source: youtube · Video: AI Moral Status · Posted: 2023-11-03T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz4rz6oohEklsoaAxF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzHaqc5UM1D2fFoYsV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwWyEX4pwQPQtbjGTx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyBW5qCeGwLGSA1AOR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwGEfNias2R_zZHXN94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwegUS4AoNJ_xfBLvl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzisOHRqFKJezwFLnh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxywwwY79Nf0X5Jw0x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx1uk3FdyYEAwHDJ9d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw_A4fzfl2EHPGRghh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
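Downstream tooling has to turn this raw batch response into per-comment codes. A minimal sketch of that step, assuming the four dimensions shown in the table above; the allowed value sets are inferred from the values visible in this response and are a hypothetical stand-in for the real codebook:

```python
import json

# Allowed values per dimension, inferred from the response above;
# the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "approval", "outrage", "unclear"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: codes}, validating each row."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        codes = {dim: row[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{row['id']}: unexpected {dim}={value!r}")
        coded[row["id"]] = codes
    return coded

# One row from the response above, used as a lookup example.
raw = """[
 {"id":"ytc_UgzHaqc5UM1D2fFoYsV4AaABAg","responsibility":"ai_itself",
  "reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""
coded = parse_llm_response(raw)
print(coded["ytc_UgzHaqc5UM1D2fFoYsV4AaABAg"]["emotion"])  # fear
```

Validating against an explicit value set catches the common failure mode of free-text LLM output drifting outside the coding scheme, so a bad batch fails loudly instead of silently polluting the coded dataset.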