Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment

> Retired plumber here. I’m sorry, I just think this AI fear mongering is horse shit. Humans have been developing and then immediately weaponizing technology forever. Of course we will use AI to attack and defraud. But that’s nothing new. Show me an AI that has a will, a desire, or some consciousness of any kind and I would be interested. But that doesn’t exist. It’s science fiction. They can’t even get these fucking bots to reliably drive a truck, even though the experts have been wailing about it for 20 years now. Remember Andrew Yang and his predictions of irrelevant truck drivers. Yeah. Still waiting. I call bullshit.

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2025-06-17T16:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwLksCPJKL65VZhaI94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyemc_gxuL5MhjOE1F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyJP1TyYRXN2BrdrN14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugzt_zwzW4q2BCwTrSt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzDVRlnhC-2604r-414AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzagvXcShbKi8YHp854AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxO18pV5XPknMB41rp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxyBQEwYDkYoCx1Hid4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxvkUO7tZNf6WZDmKZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyH9VPRdHOroTkUGIR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
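The raw response above is a JSON array with one record per comment, each coded on four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and sanity-checked is shown below; note that the allowed value sets are inferred only from the values visible in this response, not from an authoritative codebook, and `validate_codes` is a hypothetical helper, not part of this tool.

```python
import json

# Allowed values per coding dimension, inferred from the response shown
# above (an assumption, not the tool's actual codebook).
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "outrage", "mixed", "indifference", "fear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only records whose
    values on every dimension fall inside the inferred scheme."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Usage: a record matching the scheme passes; an out-of-scheme value is dropped.
raw = (
    '[{"id":"ytc_UgzagvXcShbKi8YHp854AaABAg","responsibility":"user",'
    '"reasoning":"deontological","policy":"none","emotion":"outrage"}]'
)
print(len(validate_codes(raw)))  # 1
```

Validating against a fixed value set before storing codes catches the common failure mode where the model invents a label outside the scheme.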