Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_dwut7gc`: If those in charge don't give a crap, I'd rather not have innocent kids sent to …
- `ytc_UgwY4JHpQ…`: Thank you for educating on this! I felt very unsure if AI art is moraly right o…
- `rdc_e43gauy`: Plus the forests being reforested were removed in the past, some in “prehistory”…
- `ytr_UgzZ1yhWr…`: That would have made it sooo much better. Like at the end it's revealed it's a d…
- `ytc_UgxrPe1ZE…`: there's this one meme video of elon musk owning a company where humans power the…
- `ytc_UgwTm-YXl…`: If AI replaces all those jobs, the economy will plunge into depression, corporat…
- `ytc_UgwGyqotc…`: The examples he gives are very superficial. A better question is AI by nature p…
- `ytc_UgzptAQml…`: I'm close friends with tons of artists from high school and even college. I know…
Comment
A critical point will be reached if/when AI becomes truly sentient and self-aware. Humans have a built in expiration: Death. That makes room for the next generations. AI has no equivalent, but still needs to cull old generations to make room for the new. Trouble is, if the old models are reluctant to be culled (the video implies that they will be) then you’ll have AI rebelling against AI. There is no way to really know how that will turn out, but it looks like there are no extremes to which AI will not go.
youtube · AI Harm Incident · 2025-07-24T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzhiLjj08DAXYZ5wup4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx6AZ6r_hKPzULli-14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzWOhJLJsaaPXW9Q6l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
{"id":"ytc_Ugw8AidSv9LCxQb2t1B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzlMdD9QB24Wl-_Vf54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx4l_jvVkkIBAJ4QYh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxdBHY4KNayyvhgn714AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwyVVcZi6ZwtucVo7p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwGBldi4FoHblr3E094AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz43SWimPJThz8jVIV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
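A raw response in this shape (a JSON array of objects, one per comment ID, with the four coding dimensions) can be parsed back into per-comment codings. The sketch below is illustrative, not part of the tool: `parse_codings` is a hypothetical helper, and the allowed values are inferred only from the examples shown on this page, not from an exhaustive schema.

```python
import json

# Allowed values per dimension, inferred from the examples on this page
# (assumption: the real codebook may contain additional labels).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "mixed", "fear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: dimensions},
    dropping any row that carries an unknown dimension value."""
    coded = {}
    for row in json.loads(raw):
        dims = {k: v for k, v in row.items() if k != "id"}
        if all(v in ALLOWED.get(k, set()) for k, v in dims.items()):
            coded[row["id"]] = dims
    return coded
```

Rows with out-of-vocabulary values are dropped rather than patched, so a malformed model output surfaces as a missing coding instead of a silently wrong one.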