Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples

- "Why do we even call them AI Artists? Call them AI image generators or something …" (ytc_Ugzk_6tCR…)
- "AI Creating Tech Jobs Globally. DVLT Datavault AI up over 100 % in 6 mth. Over 2…" (ytc_UgwvvT5mp…)
- "I dont really care if anyone uses AI to make art. I am a pencil and paper type a…" (ytc_Ugw-LBgGJ…)
- "I talk to AI like I would have talked to people if I knew they were listening to…" (ytc_Ugw7kweR1…)
- "The first sentient AI will be produced by some sleazy tech company trying to cre…" (ytc_UgwotLq43…)
- "i wish andrew yang and andrew ng would fused into one body as andrew (ya)ng to r…" (ytc_Ugyk4fZ_1…)
- "Well i might be ai cuz i STILL cant draw hands without me damaging the paper…" (ytc_UgyDGYZwQ…)
- "I actually saw one pr00mpter use an argument in defense of generative AI of 'If …" (ytc_UgxTVaHZz…)
Comment
I think if you bet on humanity being benevolent, we’re doomed. Theres almost no realistic scenario where these people attain this power and will willingly divvy anything out to anyone. Even if this super artificial intelligence turns out to be not even possible, the wealth gap will be so tremendous that we basically go back to super rich and dirt poor but this time the rich have all resources (food, water, infrastructure and currency) and have tech like surveillance and AI to keep everyone in a cage. I dont see how we get out of this besides a total coup of government and systems. Anything besides that doesnt seem like it will be enough to change the course we’re currently on (and headed there at tremendous speeds i might add).
youtube · AI Governance · 2025-09-05T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxpVgWmGoJyARsmaxl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzA_XDcXYbP9pulHw14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwAvudo8WdxvuEOIyd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxGII4nsOB5n5rcwJ14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzMAV_NNraj8qXBapd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzOh2WsptWSTmlHjRh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzZcrDNmGybBc_hPPp4AaABAg","responsibility":"ai_itself","reasoning":"none","policy":"none","emotion":"fear"},
{"id":"ytc_UgxBbtg8kE7-P6xvFSl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwlJZHhqriFWsH-UDl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwzC3aBA_kla91t8rt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
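The raw response is a JSON array of coding records, one per comment, keyed by comment ID. A minimal sketch of the "look up by comment ID" step, assuming the model output parses as the array shown above (the `index_by_id` helper name is ours, not part of the tool):

```python
import json

# Raw model output in the format shown above (truncated to two
# records here for brevity; real batches contain ten).
raw_response = """
[
  {"id":"ytc_UgxpVgWmGoJyARsmaxl4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxBbtg8kE7-P6xvFSl4AaABAg","responsibility":"developer",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and key each coding record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgxBbtg8kE7-P6xvFSl4AaABAg"]["policy"])  # regulate
```

With the records indexed this way, any coded comment's dimensions (responsibility, reasoning, policy, emotion) can be retrieved directly from its `ytc_…` ID, which is what the inspection view above does per comment.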