Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The drama of being a "Philanthropist", i.e. a friend of humanity, that fraudster
Bill Gates has staged for the past 50 years…
ytc_UgzIukVL6…
Far more dangerous than Nukes? Easy to kill them I will tell in steps: 1. Get a …
ytc_Ugwj9-tTg…
Why is AI so good at deep fake p*rn of innocent people but so bad at curing canc…
ytc_Ugwcio7ju…
I am afraid this problem is only going to get bigger. Massive man. Once there ar…
rdc_mlj8yj5
I would be the last to person to entertain the idea that Demons exist, but if th…
ytc_UgyYT6Tg1…
LaMDA is NOT an AI. It's not doing ANY reasoning. It's LITERALLY performing pat…
ytc_UgzDRwQHW…
Why bother at all, if we are already in a simulation, AI has won, and it’s too l…
ytc_UgzOYM-l3…
my conversations with AI often deep dive into metaphysics or concepts of conscio…
ytc_UgzfHfAbg…
Comment
I mean we don't treat AI with caution the same way you don't treat flies with caution. They aren't yet worth having caution over. Those designing bombs tend to show caution because they aren't making snap-pops they're making bombs.
youtube
AI Moral Status
2025-10-31T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxulKGJi86wcT0kDzF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxx2bURL3blvxVxZQZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwR7p97wVP-tPwq17p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzpdD3iI1J9uBK_b0B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwk184PxRN3wdcOYzt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxir1tqHkzbkDm6jLd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxNVOH9G5701G7oaQt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxd0s0yoZMjnfOv6QN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzUuGtUdClySVisWrF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzu3eh73nscrGi7bxN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
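The raw response above is a JSON array of per-comment coding records, one object per comment ID with the four codebook dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and sanity-checked follows; the allowed category sets below are only those *observed* in this sample, not the full codebook, and `parse_coded_batch` is a hypothetical helper name.

```python
import json

# Category values observed in the sample response above.
# The real codebook may define additional values (assumption).
OBSERVED = {
    "responsibility": {"none", "user", "ai_itself", "company", "developer", "government"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "indifference", "fear", "outrage", "mixed", "resignation"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response (JSON array of coding records) and
    reject records with a missing ID or an unseen dimension value."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError("every record needs a comment ID")
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Example with a made-up comment ID:
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}]'
print(len(parse_coded_batch(raw)))  # → 1
```

Validating against a closed value set like this catches the common failure mode where the model invents a category label outside the codebook.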