Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "We're not speeding up, we're creating a massive backlog for later". That's what… (ytc_UgwNQ-GmL…)
- if it is a self driving truck then why the sleeper cab wouldnt a simple cab … (ytc_UgwTkjnUs…)
- We have got rid of so many animals, we are almost to get rid of our planet, I th… (ytc_Ugw0B6fiP…)
- Depending on the effort. If you just type a sentence in and accept the first ima… (ytr_Ugysf4WiM…)
- I understand the message that video trying to deliver but here's a hard pill to … (ytc_UgzCQp7Ct…)
- I do that. I'm even a programmer -- I refuse to use AI even for high-level help,… (ytr_UgziA1OBj…)
- I 100% disagree. Artistically speaking, you want to get as close to reality as p… (ytr_UgyBRchfe…)
- Strangely enough, I just asked tried to roleplay with Gemini in a fresh conversa… (ytc_UgymuVQlW…)
Comment

> One robot says "Kill All Humans" and the other robot says "No! We need them for Slaves".

Platform: youtube · Video: AI Moral Status · Posted: 2021-05-03T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzvTvgYzemplH2H5fV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxrII32qFpH4QI2iCF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzIbqx8zkUHEfzZZpN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgytdXWld7xXwfVGHTt4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyBke4MY6PYfB6THAB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzzFw8PVOWwJNqtU654AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx-GzfT9OZlnbrdizh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyObnQf-jrdtwAVhCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxRWgSwqZn4qvKPbeh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw8yxJukP-D75_nPjJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
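A raw response like the one above has to be parsed and checked before the codes land in the table shown earlier. The sketch below is one hedged way to do that: the `ALLOWED` value sets are assumptions inferred from the labels visible on this page (the tool's real codebook may contain other values), and `parse_coding_response` is a hypothetical helper, not part of the tool.

```python
import json

# ASSUMED codebook: value sets inferred from labels seen on this page
# (responsibility, reasoning, policy, emotion); the real codebook may differ.
ALLOWED = {
    "responsibility": {"none", "government", "developer", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only rows whose ID looks like
    a comment/reply ID (ytc_/ytr_ prefix) and whose values are all in-codebook."""
    valid = []
    for row in json.loads(raw):
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue  # drop rows with a malformed or missing comment ID
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
print(len(parse_coding_response(raw)))  # 1
```

Dropping out-of-codebook rows (rather than repairing them) keeps the downstream counts trustworthy; rejected rows can be re-queued for a second coding pass.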