Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Is it just me or when Alison played the message of her voice the way she spoke i…
ytc_UgytIJD2x…
Did they call themselves an artist or am I missing something. Because it looks t…
ytr_Ugw_wKB59…
Google now generates 25% of their code with AI internally. Do you think Google e…
rdc_lz5qppz
Not saying Trump is going about it the right way, but the coming AI job apocalyp…
ytc_UgwmOIVTo…
What life will be like in 100 years will be heavily dependent on several major i…
rdc_oi380pr
i used AI art to do reference images and inspire some character designs "a blond…
ytc_Ugxgxxwts…
No worries, there's no future for any of us so no need to plan ahead. That's the…
rdc_nbp02uz
Can AI in the future help those besides CEOs?
Yes.
Will they? For a hefty pr…
rdc_ohz5sst
Comment
I view it like this…AI yes it’s artificial/digital/computers whatever, but we’ve gotten them to the point where they are replicating human intelligence and so we are essentially creating an endless amount of brains on this earth (or people/agents), and what happens to a world with overpopulation and too many ideas that just cause more debate and conflict…I say unplug AI right now, make the sh*t illegal, like Dune we can have some folks have access, but not the general population…look what weak gun laws have already done to America…weak AI laws is really going to make society unsafe….but of course AI will help find a cure for cancer..that’s even more people on earth
youtube
AI Moral Status
2026-03-01T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzHRerdztEfbxVwzp94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy8WC6Ga2hTc1XWSLJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyzhmwMjZKNEAlyJn54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzyi6rGX5WjEVJwgix4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzItwd5phga2CZZ1qZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyQhKze8Ue9ng7UgX14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyZIHf8JPWBSGS2bP54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxKXdE5ekjTgPyLhXN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw0rfX4KN4ZvYpzmdp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyZdciUVdoBb5cHTEB4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]