Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgxSCow-l…: "Karen knows her stuff. I always found Sam Altman such a creep. Have to read what…"
- ytc_UgzzJ8IoD…: "Don’t take all of the jobs from people for AI. We need more human interaction. J…"
- ytc_UgxWqN6As…: "I can’t believe that AI won’t bring people closer to God. If you actually learn…"
- ytc_Ugz_ybguf…: "Killing others requires an emotional motive. Computers don't lust for control be…"
- ytc_UgyBVz8_n…: "It’s An AI Robot. Elon Musk Kind Of Warned Us about The AI Robots In A Interview…"
- ytc_Ugz0gkkrN…: "I asked chatgpt when the big one earthquake in the philippines would happen. And…"
- ytr_UgxAT9c3a…: "A.I. doesn't give a shit.... it will go from A to B destroying EVERYTHING in its…"
- ytc_UgzDE_2QV…: "I became an electrician 3 years ago. I recommend, by the way there's nothing rid…"
Comment
"So basically, the only thing Idiocracy was missing was a horde of super intelligent AI robots doing all the actual work."
youtube · AI Governance · 2025-12-06T07:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwQDgo1UqaqPLrXL-B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxDwaFdjGTY7aT3IL54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxYExPBZkpcFkN4kql4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx-CphHBvh7bBT7Skx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyfl7Fwe0xqJZimbb94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzqiFSExkGyp8JKDr54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyG9tHzogryGHhWzWl4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx3vf1g1MGDtD4Qf_t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwCbDx2-MyjKvqUykJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwRHxlhHsnXGL0hk7h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
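A response like the one above can be checked before the codes are stored. The sketch below is a minimal validator, assuming the allowed values are those observed in the samples here (the full codebook may define more); the function name and the example record are illustrative, not part of the tool.

```python
import json

# Values observed in the sample responses above; the real codebook may allow more.
ALLOWED = {
    "responsibility": {"unclear", "ai_itself", "distributed", "company"},
    "reasoning": {"unclear", "deontological", "consequentialist", "contractualist"},
    "policy": {"unclear", "ban", "liability", "none", "regulate"},
    "emotion": {"unclear", "mixed", "fear", "indifference", "outrage", "approval"},
}

def parse_coding_response(raw: str) -> list:
    """Parse a raw LLM coding response and validate every record.

    Raises ValueError if a record lacks an id or uses an unknown code.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_coding_response(raw)[0]["policy"])  # regulate
```

Rejecting unknown codes at parse time keeps the "unclear" fallbacks intentional rather than the result of a malformed model output slipping through.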