Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by its comment ID.
Comment

> Imagine monkeys suddenly deciding they want to control humans - They wouldn't even be able to conceive of how to. Why do humans think they will control AI that is more intelligent, and more capable than humans?

- Platform: youtube
- Video: AI Moral Status
- Posted: 2025-08-05T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugz1WRDw2vm7PGpl-fp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzMLuTBbi6tWWNl9al4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
 {"id":"ytc_UgzAPFpRNO_85va6l394AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugz1gveJEa1JhVJ7O5B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgzsQTKGfqoWWbRL1tV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgwPMOLzZXlSMajF_Od4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugy2WhvU3eZbDCUHbtN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgwQKeXblI8BQm6iPKx4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugx-ALnYBj5KnrB-6Dl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgwsRBWGtegtRzBT88t4AaABAg","responsibility":"company","reasoning":"unclear","policy":"industry_self","emotion":"indifference"}]
```
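The lookup shown on this page (finding the coded dimensions for one comment inside a batch response) can be sketched as follows. This is a minimal illustration, not the dashboard's actual code: the `index_by_comment_id` helper is invented here, and `raw_response` is abbreviated to two rows of the full array above.

```python
import json

# Abbreviated two-row sample of the batch coding response shown above
# (the real response contains ten coded comments).
raw_response = """[
  {"id": "ytc_UgwPMOLzZXlSMajF_Od4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugz1WRDw2vm7PGpl-fp4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a batch coding response and index the coded rows by comment id."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgwPMOLzZXlSMajF_Od4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

Indexing by `id` turns the O(n) scan of the response array into a constant-time lookup, which matches how a "look up by comment ID" view would fetch a single record.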