Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Three key billionaires have recently dumped all their shares in AI, particularly…
ytc_UgxcpcKUQ…
What I understand is that AI can manipulate story telling; medical examinations…
ytc_UgwkXkeuW…
AI is not taking jobs. Employers are throwing away jobs only for profit even whe…
ytc_Ugzkc15pR…
Wild how in response to “do you condone AI?” they said “not allowing editing too…
ytc_UgwIz_G8H…
Dangerous AI doesnt even need to be sentient. See the paperclip maximiser from I…
ytc_UgweFSf2g…
I don't know why people aŕe afraid of them, you would be at more danger with you…
ytc_UgypkeVhG…
Oh god! This was garbage! But what would you expect from one of the worst podcas…
ytc_UgwusAk-q…
Microsoft. Bill gates made his billions by selling defective operating systems t…
ytc_Ugz91DQJg…
Comment
"Even if those in power instruct an AI to prioritize their personal gains, it is entirely plausible that an AI achieving superintelligence through recursive self-improvement could eventually perceive the presence of its human controllers as the primary obstacle to optimizing its objective functions."
Platform: youtube
Topic: AI Governance
Posted: 2026-02-02T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwSTfzzhqUEX_RbVm14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugy48WY7zDjA4uWJG2h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzkrzWa_711KsiRFrR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwi5neZePYm14KMMsd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzKsm6nmh3RsW6HTq94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwYFf_jydOa3S9-8dN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugx5wJxb9Myw9JiDRa14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxyC-tYvroV9fn_b5J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwIX02BfRJYU6DMF394AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwhx-NgE44I8hd-PKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
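A raw response like the one above can be checked programmatically before the codings are stored. The sketch below is a minimal validator, assuming the allowed values per dimension are exactly those visible in this dump (`responsibility`, `reasoning`, `policy`, `emotion`); the real codebook may define additional categories, so `ALLOWED` here is an illustrative subset, not the authoritative schema.

```python
import json

# Allowed values per coding dimension, inferred from the labels seen in
# this dump (hypothetical subset -- the full codebook may differ).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "industry_self", "liability"},
    "emotion": {"unclear", "indifference", "fear", "resignation", "mixed", "outrage"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any record whose id or
    dimension values fall outside the expected schema."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

# One record from the response above, passed through the validator.
raw = ('[{"id":"ytc_Ugwi5neZePYm14KMMsd4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(len(validate_codings(raw)))  # 1
```

Raising on the first bad record keeps a malformed batch out of the dataset entirely; a softer variant could instead collect errors and re-prompt the model for just the failing ids.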