Raw LLM Responses
Inspect the exact model output for any coded comment; look up by comment ID.
Random samples (click to inspect):

- ytc_Ugw4nJ2Sf…: "Are we seriously still doing the Digital art vs Traditional art debate? Why aren…"
- ytr_UgynULLsA…: "@jacappa1635 Thanks for watching the video! I hope you weren't expecting the rob…"
- ytc_Ugwwggttv…: "I’m not surprised I found out theirs a Ai version of me because of a discord ser…"
- ytr_Ugy3kXCgI…: "Agreed, I think in some ways the idea that AI can never be used in art is just a…"
- ytc_UgwsA_pRC…: "I get the concerns about AI, but Olovka lets me organize notes and draft essays …"
- ytc_UgywX0loQ…: "Fun fact only Sons and daughters of a B..... Thinks A.I exists prove me wrong if…"
- rdc_oi15cvh: "💯% Tarrifs Boy messed up the Straights. Now 2Milliy toll or death ☠️ by drone an…"
- ytc_Ugxug0COv…: "7:33 whos gonna tell him that alot of AI 🌽 sites often use characters who are ca…"
Comment

> 42:25 - The part that is missing in this conversation so far is self-preservation. Self-preservation causes humans to want to annihilate other humans - i.e., war. Once the AI wants to preserve itself, then we have a problem. As long as the AI is in the box, we can pull the plug. When it figures out how to optimise and control the autonomous robots, then it will be able to stop us pulling the plug. It is only a matter of time.

youtube | AI Governance | 2025-11-12T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxq77RqxhqonCeaATB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz10SmduLaUTyoC3e94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgztFDp9NaMemoVVsMZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"unclear"},
{"id":"ytc_UgwpxWBRwfB3Z6UYtOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxlgKEXEjSXguQBcRx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxsDYGt5pnHuEcwADB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxRl-p9vVOUqtFNm554AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmROmulnnePKne1L54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwEi_7ke6mt_U6kqIF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzg9KKyHaOn5A1fsDV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
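The lookup-by-comment-ID step can be sketched as follows. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response shown above; the `index_codings` helper, its validation logic, and the two-entry sample payload are illustrative assumptions, not the tool's actual implementation.

```python
import json

# A raw LLM response: a JSON array of per-comment codings.
# Two entries copied from the batch shown above, as sample input.
raw_response = """
[
 {"id":"ytc_Ugxq77RqxhqonCeaATB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugzg9KKyHaOn5A1fsDV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
"""

# The four coding dimensions recorded for each comment.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw response and key each coding by comment ID.

    Skips entries missing the "id" field or any coding dimension,
    so one malformed row does not poison the whole batch.
    """
    out = {}
    for entry in json.loads(raw):
        if "id" in entry and all(d in entry for d in DIMENSIONS):
            out[entry["id"]] = {d: entry[d] for d in DIMENSIONS}
    return out

codings = index_codings(raw_response)
print(codings["ytc_Ugzg9KKyHaOn5A1fsDV4AaABAg"]["emotion"])  # fear
```

Keying the batch by ID like this is what makes the "look up by comment ID" view cheap: each inspected comment is a single dictionary access rather than a scan of the raw response.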