Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
As long as Capitalism is still the game we're playing AI isn't going to save us. It's not going to help us. We're going to come to rely on it and the rich will still be in the position they're always been in meaning that they control the means of production, the access to the tools, what you have to give up in order to have access to the tools. Once we are a government that's actually by the people and for the people then AI becomes our pathway to a Star Trek-like post-scarcity world. Where people's inherent value is in how they choose to use their time and resources as opposed to how they use their time to amass resources for their employers.
Platform: youtube
Video: AI Moral Status
Posted: 2025-07-29T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzoiP134-vi1msu60p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw4zQf6Nzef2PndocR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx-Dz3KOv9nBUV9P5F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwSHXFJggXkkAXRYW14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzWbSVGFstlRXT6FKR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxErHozGMeyfTNa5614AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzNLCBwu7Cn37FUYGN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx9N_OlKQOYB475l8l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx-JQdLgeY-4VNfOmp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw_DG16Q3Vs_1drjxd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
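A raw response like the one above can be parsed and looked up by comment ID to recover the four coded dimensions for a single comment. The sketch below is a minimal illustration, not part of the original tool: the allowed-value sets are inferred only from the values visible in this sample, and the truncated rows are omitted, so treat both as assumptions rather than the full codebook.

```python
import json

# A subset of the raw LLM response above (only untruncated rows used here).
raw = """
[
  {"id": "ytc_UgxErHozGMeyfTNa5614AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx9N_OlKQOYB475l8l4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "regulate", "emotion": "fear"}
]
"""

# Assumed allowed values, inferred from this sample only -- not the real codebook.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed", "virtue"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"outrage", "resignation", "mixed", "fear", "approval"},
}

def lookup(codes, comment_id):
    """Return the coded dimensions for one comment ID, checking each value."""
    by_id = {row["id"]: row for row in codes}
    row = by_id[comment_id]
    for dim, allowed in ALLOWED.items():
        if row[dim] not in allowed:
            raise ValueError(f"unexpected {dim} value: {row[dim]!r}")
    # Drop the ID so only the four coded dimensions remain.
    return {k: v for k, v in row.items() if k != "id"}

codes = json.loads(raw)
print(lookup(codes, "ytc_UgxErHozGMeyfTNa5614AaABAg"))
# → {'responsibility': 'company', 'reasoning': 'deontological',
#    'policy': 'regulate', 'emotion': 'outrage'}
```

The entry returned here matches the Coding Result table above for the selected comment (company / deontological / regulate / outrage).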