Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- It seems to me that Sydney was programmed personally by Bill Gates and it was gi… (ytc_UgxmABykH…)
- There is so much misinformation here it is hilariously wrong. No, an AI cannot l… (ytc_UgxfjVnI3…)
- AI like chat GPT has levels if you talk to it in a certain way it will unlock to… (ytc_UgzjxQ4Fo…)
- Even if it’s ai generated- you’re gonna have to do more than be matrix slop and … (rdc_ohp7m7c)
- Speaking of ads trying to give reasons to usa AI, there's this ad that says "Thi… (ytc_UgyeP7cjc…)
- Is this video a low IQ test? Tell me you don't know how an LLM works without tel… (ytc_UgzU5_CY_…)
- it's so amazing that an organic intelligence with a worse learning algorithm can… (ytc_Ugz9rdXWW…)
- yea that guy doesn't look like one of the founders of Skynet.. Not at all.… (ytc_Ugz5T4krN…)
Comment

> I do not understand how this experte even imagine this van be true, the human Boeing îs an animal he is sociabile with right conditiins pyramid of Maslow, You imagine ai will rule the world and people just sit and accept, 1 month without food ir water and we will burn the mambo jumbo down,easy unplug it

youtube · AI Governance · 2025-12-05T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyb25nC-tCNzUzeMxl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwUMcIoDYy9c3Xbhf94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyqJH4PkXHYpQTIvLB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyMvnqawOI2ipyBYfF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwUXHciGqD6FU1_OSR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzA_obx18UsuB1k1lt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxJ7EBZeNbqinTyuDF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwLO9MvSw-d_q97EXx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzDF-2ZWk_CCilohBF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugz-cmJbkZziyDnv3mZ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"mixed"}
]
```
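The "look up by comment ID" workflow above can be sketched in a few lines: parse the raw model response as a JSON array and index the records by their `id` field. A minimal sketch, assuming the response is well-formed JSON as shown (only two records from the array above are embedded here for brevity; `raw_response` and `by_id` are illustrative names, not part of the tool):

```python
import json

# Two records copied from the raw LLM response above; in practice the full
# array would come straight from the model output.
raw_response = """
[
  {"id": "ytc_UgwUXHciGqD6FU1_OSR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyqJH4PkXHYpQTIvLB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

# Parse the array, then build a dict keyed by comment ID for O(1) lookup.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Look up one coded comment by its ID.
coded = by_id["ytc_UgwUXHciGqD6FU1_OSR4AaABAg"]
print(coded["responsibility"], coded["reasoning"])  # ai_itself deontological
```

The record retrieved here matches the Coding Result table above (responsibility `ai_itself`, reasoning `deontological`), which is how a coded row can be traced back to the exact model output.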