Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "When you're used to slop, it's normal that you don't know how to distinguish re…" (ytc_Ugxi6J7JS…)
- "It is very weird each time to hear how much of the United States of America is b…" (ytc_Ugy7p0C7Y…)
- "See it even has the hand tremor of humans. And the voice just sounds like someon…" (ytc_Ugz-FEpFW…)
- "Asking if an AI is conscious is like asking if a submarine is swimming. (Chomsky…" (ytc_Ugx7yKwaS…)
- "@DIOftntAI actually requires a constant income of art to stablize itself. Not o…" (ytr_UgweCuw07…)
- "Thank you for sharing your thoughts. It's fascinating to see different perspecti…" (ytr_UgwWVyh6u…)
- "Robots in year 3000 will be dangerous they will try to destroy all humanity but …" (ytc_Ugy3B4Kgs…)
- "What a power move 😆. "You're trying to imitate my art using AI? Here's a tutoria…" (ytc_UgyoRyVfo…)
Comment
Umm. If AI gets good enough to start killing job markets. What job created afterwards couldn’t AI do? The horse and buggy guy could go work in a car factory. With AGI. There is not a job that we can create that AGI with robotics could not do. And that includes making the robots. Just silly.
youtube · AI Moral Status · 2025-07-26T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwIBrIOP07w69PcGx54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzMTH2Aw5ftszyF-5N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxyHLRnqxlgdAfnnmF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxmHjfk43vDaPpvMsl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxF2OX62UoruOk3n0h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugz6HmMOBUW7SGVih3V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxMWsSpyMbxML688gl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwn4KqlBFtACGDKcNh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzQacBjcli4X-a4Vx94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugyam6RcC2twEmk5XM54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
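Each raw response is a JSON array of per-comment codes, one object per comment ID with the four dimensions shown in the table above. A minimal sketch of how such a batch might be parsed and validated, assuming the model always returns a flat JSON array; the allowed values below are inferred only from the samples on this page, and the real codebook may define more categories:

```python
import json

# Allowed values per dimension, inferred from the responses shown above
# (hypothetical — the actual codebook may include additional categories).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "regulate", "unclear"},
    "emotion": {"fear", "indifference", "approval", "outrage", "resignation"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response, keeping only rows whose values
    all fall inside the allowed codebook categories."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Example: one in-codebook row and one with an out-of-codebook policy value.
raw = '''[
  {"id": "ytc_example1", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]'''

valid = parse_batch(raw)
print(len(valid))  # 1 — the second row is dropped ("ban" is not a known policy code)
```

Dropping (rather than silently keeping) out-of-codebook rows makes it easy to spot when the model drifts from the requested label set.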