Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below.
Random samples

- ytr_UgwCXJMKn…: @dabordietrying yes. i feel like youre just assuming im against human artists. w…
- ytc_Ugzasgyzh…: hey so i support the idea of poisoning ai, but i cant watch this video. its so s…
- ytc_UgxiRULEH…: *gasp* Oh no. People always traced and people will trace, Ai is just a tool tha…
- ytc_UgxkEvY6x…: I was expecting both to be real, but the supposed AI one would be some parody of…
- ytc_Ugzxmlbpf…: I really sympathize with 2d artists using art replicas is not right, i mean befo…
- ytc_UgwjMXEnM…: I don't like the idea of AI not just cuz it'll steal our jobs, but cuz it's maki…
- ytc_UgzvlQ-Aq…: If 2 different AI’s form form their own language and encryption to lock out huma…
- ytc_UgwD6hiT_…: Honestly saying men actually can't tell when an image is AI. My brother is the s…
Comment
the strange is think about this with AI and virtual reality the "simulation" always getting better and better and people in 100 year maybe dont experience life in "real life" like we do, they properly life more inside virtual reality and they can experience life in a completely different way, life gets much faster, you have all this dopamine and full emotions in much shorter time, the brain and energy consumption could rise very high and lifetime could get shorter except they find a solution to expend lifetime. in the end it could be like in matrix 😅 there are very many paradox ways from travel through the galaxy, life inside matrix to the world gets destroyed. but yes with a superintelligence everything could happen and with chaos theory everything that could happen properly will happen so chances are very high we will not survive that 😅
youtube · AI Governance · 2026-04-24T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwIsIktH4PM-RM6n_V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyPM0bWbmFV4LOZCBZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwXJ8WtkZzACq8FRJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyh1AZIeV7mRjYSaFl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzpDLisQjoF1tecZq94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgziL3JOtIr_pFgwShF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxDLFUDFj3eGq7X6c54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw4075y5pDVgD7C9wp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzOXEb2xqEaItRo0VR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyf_oplWb9lWWbRNfd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]
```
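The raw response is a JSON array with one object per comment, keyed by `id` and carrying the four coding dimensions shown in the table above. A minimal sketch of the "look up by comment ID" step, assuming the response has already been captured as a string (the two rows in `raw` are copied from the response above; the variable names are illustrative):

```python
import json

# Raw model output: a JSON array of per-comment codes, as returned by the LLM.
# Only two rows are reproduced here for brevity.
raw = """[
  {"id":"ytc_UgwIsIktH4PM-RM6n_V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyf_oplWb9lWWbRNfd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]"""

# Index the codes by comment ID so any single comment's coding can be looked up.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's coding by its ID.
code = codes_by_id["ytc_Ugyf_oplWb9lWWbRNfd4AaABAg"]
print(code["responsibility"], code["policy"], code["emotion"])
# company industry_self resignation
```

Since every row shares the same four dimension keys, the same dictionary works for filtering (e.g. all comments coded `emotion == "fear"`) without any further parsing.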