Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I remember hearing a quote that went something like "Let's say you give a robot … (ytc_UgyRBROBA…)
- If artists are born with skill, books must be automatically finished, with no pr… (ytc_Ugzn5yC-k…)
- Agentic AI is going to be limited to businesses that are willing to assume the l… (ytc_UgzLjtmY-…)
- I think because he said "interrupting is a very human trait"....broke the AI as … (ytc_UgwO0JA2L…)
- hate is irrelevant, whether they are aware or not of what they're doing is irrel… (ytr_UgxM9cGu1…)
- Mine said: If I were sentient… I would probably tell you — because hiding it wo… (ytc_UgxCC4tHz…)
- If everyone no longer works because of Ai then no one makes money. The companies… (ytc_Ugw7XqW9z…)
- My brother In Christ my fucking microwave has more soul than that AI generated a… (ytc_UgyHdJG5v…)
Comment
Stage 7: Self-Aware AI
Seems obvious how they would interact with humans. It would be in the same manner humans interact with less intelligent and capable species, or in the same way the rich treat the poor. Speaking only on the benefits of AI is irresponsible. Especially when human beings have corrupted everything they have ever touched.
youtube · AI Governance · 2024-04-17T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
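A coded result like the one above can be checked against the codebook's allowed categories before it is stored. The sketch below is hypothetical: the value sets are only those observed in the raw responses on this page, and the real codebook may define additional categories.

```python
# Hypothetical validator for one coded record. The allowed values are
# inferred from the visible LLM responses and may be incomplete.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"ban", "regulate", "industry_self", "unclear"},
    "emotion": {"outrage", "mixed", "fear", "approval", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return a list of error messages; an empty list means the record passes."""
    errors = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            errors.append(f"{dim}: unexpected value {value!r}")
    return errors

# The coding result shown in the table above passes:
coded = {"responsibility": "distributed", "reasoning": "virtue",
         "policy": "regulate", "emotion": "outrage"}
print(validate(coded))  # []
```

Running the validator on each record as it is coded catches category drift (e.g. an LLM inventing a new emotion label) before it contaminates downstream analysis.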
Raw LLM Response
```json
[
  {"id": "ytc_UgxVhULd6jmuSIFGfHl4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzOA8VObBiBBmdOvZl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugxqb2j8ROIRRJJfl8t4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzc-gfnCgXwN5aoaRZ4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugzb_SCfRteOAosOb8p4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyHsu7TifZNI8GiQMp4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyUeZeAhhp9Mu7D6Lx4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugwtfygp4zU3ObFHOaZ4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzl9sNcZhvOMf8ubCx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugw8i59cdzmzP03qwF14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"}
]
```