Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
This is the best tl;dr I could make, [original](https://www.reuters.com/technolo…
rdc_imldlcn
IF the robots and physical abilities actually does achieve the ability to take t…
ytc_UgxeNOL5-…
Couldn't help but imagine the game Primal Rage when he talked about governments …
ytc_UgxJpGTsq…
AI isn’t going to respond to consciousness the way a human does because it’s a d…
ytc_UgwjC4vUT…
It hearts my heart so much that we're investing all this money in AI, where the …
ytr_Ugx4HWuH6…
It’s just like watching a prank video and you can notice it’s already scripted f…
ytc_UgxkWjs1B…
Wait, go back European military don’t have no regulations but open AI is not all…
ytc_UgxhO6Vcp…
I kinda have the same experience but with my parents, they tell me that me doing…
ytc_UgzILRngf…
Comment
We're not doomed, just stuck in a hole.
The ai companies dug this hole, and society threw us into it. We have the tools to dig out a part and get out, but people using AI are digging themselves deeper instead. They end up trapping actual artists in the crossfire and it takes far more other artists and people/things outside the hole trying to help to get even one person a little closer to being out of the hole. To ai 'artists': stop using the tools to dig a hole, there's nothing at the bottom, and the deeper you make it the harder it will be to get out of when you realize you want the trees and plants and life above.
youtube
Viral AI Reaction
2025-05-14T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyNXUqnstyEU4HOfX54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz-6q9E6wgew-1XlqF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzkZA1Qw6FQxIUF-894AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugys8xADawPGTc9awjV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxQHRdYEFydSiUat8Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzqBS0TmjE-vBqg2AN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwiiOT6TFhoreddkop4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzKYsCaITGVzZmqXn94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy7XqjBhXYkDaK4qh54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugycvi5NPLfsJEwYCE94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
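The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such output can be parsed and indexed for lookup by comment ID — the allowed-value sets below are inferred only from the codes visible in this dump, so the real codebook may define more categories, and `parse_codes` is a hypothetical helper, not part of the tool:

```python
import json

# Sample raw LLM response, copied from the dump above (two rows shown).
raw = '''
[
  {"id":"ytc_UgyNXUqnstyEU4HOfX54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugys8xADawPGTc9awjV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"mixed"}
]
'''

# Allowed values inferred from the codes observed in this dump; the actual
# codebook may include additional categories.
ALLOWED = {
    "responsibility": {"company", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "ban", "none"},
    "emotion": {"fear", "outrage", "mixed", "resignation", "indifference"},
}

def parse_codes(text):
    """Parse the JSON array and index rows by comment ID, rejecting any
    row whose values fall outside the observed code sets."""
    by_id = {}
    for row in json.loads(text):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        by_id[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return by_id

codes = parse_codes(raw)
print(codes["ytc_Ugys8xADawPGTc9awjV4AaABAg"]["policy"])  # industry_self
```

Validating against a closed code set at parse time catches the common failure mode where the model invents a label outside the codebook, before bad rows reach the coded table.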