Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
WE IGNORE THIS MAN AT OUR PERIL. His warnings serve to punctuate the likely cons…
ytc_UgxqEKLC2…
Hello now give me a robot a machine gun we can do way better than that,…
ytc_Ugy1uf70n…
AI and art dont belong in the same sentence it always makes me angry to see 'ai …
ytc_UgwFdCeCv…
I wonder , if AI is going to replace humans at jobs , then why governments are u…
ytc_UgwteLfVj…
I think it's great that AI takes the Uber driver's jobs. I can't wait for AI to …
ytc_UgzpaVcsR…
Ai might not take over the world yet
but it’s takin over the delusionals 💀…
ytc_UgwBM1vZg…
Isnt road transport now driven by ai. So shouldn't mistake cause death or injury…
ytc_Ugw7_WDr8…
Rigorous adherence to truth 😅 this is spliced footage or AI garbage ! AND anyway…
ytc_Ugy1kLq0i…
Comment
I think it would be valuable to also hear from people (if there are any) who reflect on the compatibility of AGI with the physical limits of our planet, and whether these could slow down or even prevent the existence of AGI over the long-term. It seems to me that the debate often assumes infinite compute, infinite scaling, infinite materials and infinite energy, which is obviously false in biophysical terms. AGI or AI are not immaterial, they rest on finite energy and mineral resources and depend on geopolitically sensitive supply chains.
youtube · AI Governance · 2025-12-11T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
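The dimension values above come from a fixed coding scheme. A minimal sketch of validating a coded record against the value sets visible on this page (the full codebook may define additional values; the sets below are inferred only from the samples shown here):

```python
# Allowed values per dimension, inferred from the records on this page.
# Assumption: the real codebook may include values not seen in these samples.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]
```

A record matching the table above (`none` / `consequentialist` / `regulate` / `indifference`) passes with no invalid fields; an out-of-vocabulary value is flagged by dimension name.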
Raw LLM Response
[
{"id":"ytc_Ugx0de6K6H6xAVSw-QV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyG0IeJ7FTljaNLTYh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxtu-vtRVtkOA4Cbml4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwDSUlj8mIL7I9AqTR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0v8AZblGv6qhZfe14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyJavqRUaG4mm8ij7t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugz775FKQswqMYI7pex4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxR_PvJchtZ9drupRh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgziEJralpPeeel-Lyl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzIbjA5j5GA4IJ81tN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
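Since the raw response is a JSON array of coded records, looking up a comment by ID reduces to parsing and indexing it. A minimal sketch, using two IDs from the response above (field names as they appear in the payload; error handling omitted):

```python
import json

# Sketch: parse a raw LLM batch response (a JSON array of coded records)
# and index it by comment ID. Payload truncated to two records for brevity.
raw = '''[
  {"id": "ytc_Ugx0de6K6H6xAVSw-QV4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyG0IeJ7FTljaNLTYh4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]'''

# Index records by comment ID for O(1) lookup.
records = {rec["id"]: rec for rec in json.loads(raw)}

rec = records["ytc_Ugx0de6K6H6xAVSw-QV4AaABAg"]
print(rec["emotion"])  # outrage
```

This is the same lookup the page's "look up by comment ID" box performs: exact match on the `id` field of the parsed response.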