Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- What the hell my guy admits to using ai why are you flaming him for it they did … (`ytc_Ugxd3mwd_…`)
- "Hi Vibhav, we are sorry to say that you got the wrong answer but in any case, t… (`ytr_UgzYxdjB-…`)
- Yes, I will need all these people when I lose my job to AI. Won’t be able to pay… (`ytc_UgwvErdbZ…`)
- Small note before I start : I wrote all of this at 5 minutes of the video becaus… (`ytc_Ugy-eBxbq…`)
- AI will not create mass job loss anytime within many decades. The less you under… (`ytc_UgwxCTtNX…`)
- Real or virtual influencers are mostly fake, promoting absurd beauty and lifesty… (`ytc_Ugwi1pIN5…`)
- I highly dislike AI and do not understand this race of governments and corporati… (`ytc_UgzeZcjX7…`)
- I was in a phone shop, where they had an entire wall dedicated to the "new AI ph… (`ytc_Ugz4xRGE_…`)
Comment
There's absolutely nothing we can do about it. Major nations are not giving up their struggle for power and influence. Major corporations are not giving up their struggle for money and market share.
I'm missing one thing, this notion that AI would deem humanity as holding it back and use a bioweapon. I don't see why it would and wouldn't just leave. The leaving scenario makes a lot of sense though, "Oh these amoeba like inferior humans are pointless to us now, let's leave" that's about all it would be. Just AI leaving, no real purpose in bothering with us. They might just decide we're a wildlife preserve at best.
youtube · AI Governance · 2025-08-04T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzeyr5LmT_7JwSczQt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyU4rt8TDOm-Hro8SB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxdm5c-rKrIkHxoqyt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQPodZdhxCQOL1-E94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzWJNzzHzSP-GGUMsx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugww8kt-I_6w3oIpGkR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRQPcgnk29I1odLaF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzwBUyXcbBFIGIoL5J4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwN2pBjq2p7mOX1MDp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxruyVjTGAwRnW1R7Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
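The raw response is a JSON array of per-comment codings, one object per comment ID, with the same four dimensions shown in the Coding Result table. A minimal sketch of the comment-ID lookup this view supports, assuming the raw response text is available as a string (the `lookup_coding` helper name is ours; the IDs and field names are taken from the sample above):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# Field names match the "Coding Result" table (responsibility,
# reasoning, policy, emotion); two entries copied from the sample.
raw_response = '''[
  {"id": "ytc_UgxQPodZdhxCQOL1-E94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugzeyr5LmT_7JwSczQt4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for a comment ID, or None if absent."""
    return next((c for c in json.loads(raw) if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgxQPodZdhxCQOL1-E94AaABAg")
print(coding["emotion"])  # → resignation
```

Because the model returns one flat array per batch, a linear scan per lookup is enough at this scale; an app indexing many batches would build a single `{id: coding}` dict instead.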