Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "How much of this is real? Maybe 1%. Yes, still scary. AI chats possibly faked by…" (ytc_UgwEi5KoF…)
- "i dont think making a "conscious" robot would be a very smart thing to do. get m…" (ytc_UginawHKv…)
- "We appreciate your perspective on the interaction with Sophia, our AI-powered ro…" (ytr_UgwBRbR-2…)
- "I don't know why we're jumping in so deep to AI...Predictive spell checking does…" (ytc_Ugwors0oe…)
- "Gave me confirmation the government is racist because they are programming AI to…" (ytc_UgxRsuAaC…)
- "Humans conquered creation by our superior intelligence and use of tools. Art…" (ytc_UgzvpAbmD…)
- "My Daughter graduated in May 2025 with two degrees. Biology and Mathematics. She…" (ytc_UgyybmZU9…)
- "My god, i could go on whole day talking about AI "art". Okay, so first of all, t…" (ytc_UgzFwPiCw…)
Comment

> I hope Arnold is on our side!! Lol This is pretty scary already. I think even 10 years is conservative. I think the genie is out and the bottle is broken. It's already too late, we are so reliant on tech and it has us bending to it's will more and more everyday. We just don't see it because it's gradual. AI knows not time as we do. Rather than wipe us out totally I think AI will make us slaves to it. It will need us to do things it cannot for now. But I believe it WILL drastically reduce the human population in one way or another.

Platform: youtube | Topic: AI Governance | Posted: 2023-07-07T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzK5nooiEq-viGjc7N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyczcZmc2-HrFuhPpN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzHJOFX3WKkbJwMUnZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzBOXrtXoOgG9FnFvZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwij0MamYTgKs0xLGJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw_P_dAwfpeA56c3J94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwPdF0gKJoOhHMzlzZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwAN_JmenOY8DfpOBx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgysmOrmP_IwS0jvcKF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwL9QJc6ejVSwcIdC94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
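Since the raw model output is a JSON array of coded records, a downstream consumer would typically parse and sanity-check each batch before writing rows into the coding table. A minimal sketch of that step; the allowed label sets below are inferred only from the values visible in this dump, not from an authoritative codebook, and `validate_batch` is a hypothetical helper:

```python
import json

# Allowed values per coding dimension, inferred from the labels visible in
# this dump (assumption: the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "outrage", "mixed", "resignation", "approval",
                "indifference"},
}


def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed records.

    A record is kept when its id carries a known prefix (ytc_ for comments,
    ytr_ for replies, per the IDs shown above) and every coding dimension
    holds a recognized label.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid


# Demo with a hypothetical record id.
demo = ('[{"id":"ytc_demo","responsibility":"user","reasoning":"deontological",'
        '"policy":"none","emotion":"resignation"}]')
print(len(validate_batch(demo)))  # → 1
```

Records that fail validation could be queued for a retry prompt or manual review rather than silently dropped; the filter above only illustrates the gate itself.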