Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "LAWS & Bills have to be passed that ensures each robot/AI/Automation = UAI Unifo…" (ytc_UgzT2FlWc…)
- "Restructuring a.k.a. "your position has been eliminated" are just tactical excus…" (ytc_UgzMIdiVb…)
- "I think the ai is already smart enough to screw humanity i think they are just w…" (ytc_UgxQmJ24v…)
- "I suck at art but I’d rather be caught dead instead of using AI art…" (ytc_UgzPQPpR_…)
- "this is not an AI issue, this is a government issue. if you don't like AI you do…" (ytc_UgzPFKApQ…)
- "The National Highway Traffic Safety Administration (NHTSA) is investigating 16 c…" (ytc_UgyyOn5G-…)
- "It wasn't a robot it was a big Russian dude who knock him out don't know how the…" (ytc_UgydmByFQ…)
- "Scraping data for training purposes is fine, it is FUNCTIONALLY identical to the…" (ytc_Ugz7ldx2m…)
Comment

> Perhaps the reason we do not see aliens is that civilizations reach this point and create AI. Head into a Technological Dystopia. AI then wipes out the organics. There is a short story about robots arriving to help humans. They help so much that we die off. The humans are not allowed to do sports that is dangerous, to work that is dangerous, alcohol and drugs are dangerous, sex is dangerous. I see the beginning of this now.

Source: youtube · AI Moral Status · 2025-04-30T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyfaovwyjlWEu_6nAp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzdbprQNzN3qHYkh8V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyiFm4ckRfKJsZtgAp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugypi4dSPxVXZCQcF6d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxK_-81-Y3CENJ8ttJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyUDVZmo0Stw3UgraV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxnA1PtPGwpX6DTNE54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzqWDsnKS7Ss3xG3St4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz1rj_YwmTeo_6yWGB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw9qGxUVjuelb7unkR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
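A minimal sketch of how a batch response like the one above could be parsed and validated before the codes are stored. The allowed values per dimension are inferred only from the responses shown here; the real codebook may define more categories, and the function name `parse_batch` is illustrative, not part of the tool.

```python
import json

# Allowed values per coding dimension, inferred from the responses above.
# This is an assumption: the actual codebook may include further categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "government"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "fear", "outrage", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response into {comment_id: coding},
    dropping rows with a missing id or an out-of-vocabulary value."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = '[{"id":"ytc_x","responsibility":"ai_itself",' \
      '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
print(parse_batch(raw)["ytc_x"]["emotion"])  # fear
```

Keying the result by comment ID matches the "look up by comment ID" workflow above, and rejecting out-of-vocabulary values keeps malformed model output from silently entering the coded dataset.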