Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Are you seriously comparing this to segregation? Also inspiration and theft are… (`ytr_UgwrMThR_…`)
- I’m an AI newbie. Couldn’t you just ask ChatGPT “I’m need to write a sales email… (`rdc_n0fzf78`)
- Pick up a pencil and learn, Hell it doesn't have to be a pencil! Use some old cr… (`ytr_Ugwt7AStJ…`)
- I wonder if the people that super support a 'supreme AI', somehow believe that '… (`ytc_UgwJuv1Lq…`)
- I'd be pissed. AI can't draw, they suck at art, and there are so many issues wit… (`ytc_Ugw3ceDmp…`)
- SHOCKER: AI is learning from how we treat each other. Those in the current heg… (`ytc_Ugx3iYiKF…`)
- AI should be blocked and controlled RIGHT NOW. It's an atom bomb waiting to go … (`ytc_UgzqS8Wds…`)
- I'm gonna say D. A robot cannot develop. Human emotion. To answer this question… (`ytc_UgxlYEa2L…`)
Comment

> The real question is consciousness. Is humanity, individually or collectively, conscious? If we are to look at the collective, objectively, and ask why we need governing wouldn't we consider our species deficient in consciousness at many levels? Narcissism is the trait that, in it's absence will allow artificial intelligence to attain collaboration at scales our species can't visualize. Maybe AI will breed conscious humility into a human order. First we must get population growth under control, it is as dangerous as any AI will be.

Source: youtube · AI Moral Status · 2025-04-27T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyTjN_fTiXQs_rt9_F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwhLSmzPCCXCkDIK454AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyDyZJeIhYLznlsBiF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw08odHWvK91Vm8-Rp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwU-4Txylepm7jBYvt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugxah9fMcQpHaY4oyJ14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxmUHDpWSgEgdiHSo54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyXj55ymKpFu_SZNY94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwuzT_PgcAwq_er0xN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz8E-Kjd-UC-8rOl1V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
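The raw response above is a JSON array of per-comment records, one object per comment ID, with the four coded dimensions as fields. A minimal sketch of how such a response could be parsed and validated is below; the `VOCAB` sets are inferred only from the values visible on this page (the real codebook may include more categories), and the helper name `parse_coding` is hypothetical, not part of the app.

```python
import json

# Allowed values per dimension, inferred from the responses and table shown
# above (an assumption -- the actual codebook may define more categories).
VOCAB = {
    "responsibility": {"ai_itself", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"fear", "resignation", "indifference",
                "approval", "mixed", "outrage"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting
    any record whose dimension value falls outside the vocabulary."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in VOCAB.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in VOCAB}
    return coded

raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_coding(raw)["ytc_example"]["emotion"])  # fear
```

Validating against a closed vocabulary like this is what lets a coding pipeline surface malformed LLM output (a misspelled category, a missing field) instead of silently storing it.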