Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI does not need to outperform a human. And software that outperforms humans has…" (`ytc_UgxIWhTHR…`)
- "Until AI is REGULATED, corporations won't adopt it due to the risks and liabilit…" (`ytc_UgzHu4Avc…`)
- "Jobs to survive AI? Jobs that make connections with people. Skilled jobs like we…" (`ytc_UgxUzltst…`)
- "I disagree, you can't have privacy topics without touching on the security of ho…" (`rdc_koylksz`)
- "Okay honestly, if they're like programming from 0 to 100 and making art then id …" (`ytc_UgxIHidcd…`)
- "IVE BROKE THE FILTER MULTIPLE TIMES IT DID NOT END WELL I TRIED SAYING SOMETHING…" (`ytc_Ugz_mBABF…`)
- "Yea but the AI doesn't have a conscience. The boundaries set into it are its 'co…" (`ytr_UgyvKbhGI…`)
- "For now, all Godfather of AI talks about is a pure fantasy. The transformer base…" (`ytc_UgxGvkQVg…`)
Comment
Basically computer chips are going to eat us all. I've got an idea, Why not round up all the tech trillionaires that keep pushing this AI crap down our throats, and make them to eat computer chips for dinner every night if they love it so much. I don't recall ever being asked or voting to approve the wholesale takeover of our streets and lives with robots, drones, self-driving cars and all the other nonsense. Nobody ever asked us if we thought this was a good idea or ask if we want this or not. Nobody. Why?
Source: youtube · Topic: AI Governance · 2025-08-10T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzfQdT58BPiDy5daDB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgydZW2l5QKvh6HuDGF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyH-ByG9G9kpLGUOK14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxBCREkQ-z749LJGyV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzSf5AJtPAyluVHHQ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz6861p--5-kY1H7z94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxZL9FZ2NDUJ0aZyFJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxxWDPgunOQTvsNBS94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxb8zXARZMgGSibYHJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzTPmXNv6u_1bx1rNJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"}
]
```