Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- UT is the Ai actually creating those ideas or are they just compiling what human… (ytc_UgwuHmMmn…)
- 12:06 "very solid understanding" whoa whoa.... Unless he's using a different def… (ytc_Ugyxc90j8…)
- Posting something in public doesn't automatically mean your letting other people… (ytr_Ugzw9qSY5…)
- i am very glad somebody talks about this topic even though the poster boy of dri… (ytc_Ugyyz1_kN…)
- @seneca983 What does this have to do with auto-pilot??? The idiot driver respons… (ytr_Ugz05N2k2…)
- Freedom to speak our minds implies freedom to think our minds. I support Sam and… (ytc_UgyVkSF1z…)
- Yeah, they're only functional when given the right information, but if you just … (ytr_UgwasjHap…)
- This is what happens when you get all the activist teachers and bs DOE, this loo… (ytc_UgwPcyj3L…)
Comment
I don't buy it for the same reason that autonomous AI cars took way longer than expected. We still feel the need to hold another person accountable if something turns out to be wrong. The board doesn't want to be the only humans in the decision loop, and the same goes for the people under them, and so on, all the way down to the bottom layer. Doesn't matter if the AI is proven to beat a human at every task that generates money. If things go south, you gotta have your scapegoat.
Platform: youtube
Video: Viral AI Reaction
Posted: 2025-11-24T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwRPPT3Yt14H-9QN294AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy8P2DdeWLFa-RnYft4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyJNogUBXC79PHPQLJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzpMmn5Q0nTVzEC-yR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwxDgWZuOrVeLtcmg94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwYU39ZLo_kYX63PwB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz1I0fCiykJcHvYAiB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyyJUGjcjh9q1oexLt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOXqRpiQ1fcnqzk714AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw5AVFpkZEpLfuMpjF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
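The look-up-by-comment-ID view above can be sketched as a small parse-and-index step over a raw response of this shape. This is a minimal sketch, assuming the raw output is a JSON array like the one shown; the function and variable names here are illustrative, not part of the tool itself.

```python
import json

# Raw LLM response: a JSON array of per-comment codings, trimmed to two
# entries from the response shown above.
raw_response = '''
[
  {"id": "ytc_UgwRPPT3Yt14H-9QN294AaABAg", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw5AVFpkZEpLfuMpjF4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
'''

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index each coding by its comment ID."""
    return {entry["id"]: entry for entry in json.loads(raw)}

codings = index_codings(raw_response)
# Look up the coding for one comment ID.
print(codings["ytc_UgwRPPT3Yt14H-9QN294AaABAg"]["responsibility"])  # distributed
```

Indexing by `id` makes each lookup a dictionary access, which matches the inspect-by-comment-ID flow above: one parse per response, then constant-time retrieval per comment.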