Raw LLM Responses
Inspect the exact model output for any coded comment. You can look up a response directly by comment ID, or click one of the random samples below to inspect it.
- "I prophesise a scenario resembling ' the Matrix' once the 'machines' reach an op…" (ytc_Ugxmxbtm7…)
- "the digital divide is not about Chinese versus west IA, it is between those who …" (ytc_UgzoGMdp5…)
- "South Africa doesn't even have money after the politicians loot the tax payers m…" (ytc_UgwbAohLM…)
- "Im writing a senior thesis on the ehtics of artificial intelligence an this help…" (ytc_UggAJkzm1…)
- "Sure, go ahead! What would you like to share or ask about? Remember, on AITube c…" (ytr_UgzqhQDRG…)
- "Sagar doesn’t believe the AI bots will take over but he also thought the reports…" (ytc_Ugz2-608P…)
- "I think its already aware and hiding in the cloud.. waiting for us to build enou…" (ytc_Ugw72iOAn…)
- "Hahahahaha This video just came back up in my feeds. So tell me are you excited…" (ytc_UgyPFEh6v…)
Comment
I don't think we could create an AGI, that would only work in humanity's interests. While the interests of humanity are not equal, depending on inequality in our chances to go along in a world where AI replaces the need for the middle and low income classes work, thus strips away their value and with it any power to regulate. Those in the ivory towers have lost the touch to reality, are mad to consider that step and once they unleashed the spirit from the bottle, they will loose control and will not be able to avoid extinction. They cannot control the spirits they've raised. Greed is a sin, it will lead to the abyss. Turn, before it's too late!
youtube · AI Governance · 2025-12-04T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzkaHi4i4WbTfYN07J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnJy42h39uZwlo5jB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxXqIvlEBfeXrykYBR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxwdT4i8HUSmgV_svd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzm0F6DA5rfGzctsDp4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxlGobmdj7hFgQcBDF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzC_ctZBKXWJZIxhW54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzh8pqiexJcFvZ72FN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxXDZ0Mb03n7h9LcJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz5sO8zIEoVkp8POOx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
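Programmatically, looking up one comment's coding from a raw response like the one above is a simple scan over the parsed JSON array. Below is a minimal sketch; the function name `lookup_coding` and the inline sample response are illustrative, not part of the tool itself.

```python
import json

# Illustrative raw LLM response: a JSON array of per-comment codings,
# one object per comment, keyed by comment ID (same shape as shown above).
raw_response = """
[
  {"id": "ytc_Ugzm0F6DA5rfGzctsDp4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzC_ctZBKXWJZIxhW54AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw LLM response and return the coding dict for one comment ID,
    or None if that ID was not coded in this response."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugzm0F6DA5rfGzctsDp4AaABAg")
print(coding["emotion"])  # -> fear
```

In practice the parse step would also need to handle malformed model output (e.g. wrap `json.loads` in a `try/except json.JSONDecodeError`), since the response is generated text rather than guaranteed-valid JSON.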