Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Roman Yampolskiy says that 99% of jobs will be done by AI and humanoid robots, which sounds impressive… until you remember we live on a planet with some pretty inflexible physical limits. Are we really going to flood the world with androids without worrying about where the energy will come from, how we’ll cool the data centers, or how to mine the rare materials needed to build their titanium bones? In these futuristic visions, logistics, ecology, and physics seem to take a back seat, but maybe AGI will invent magical energy and infinite batteries too? And if by some technological miracle we do manage to create an intelligence that surpasses us in everything: from sweeping the floor to designing its own improved version, then it might ask a pretty logical question: why keep the humans around?
youtube AI Governance 2025-09-04T15:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  unclear
Reasoning       consequentialist
Policy          unclear
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugw5b3Hhb1nVyCTnh4t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxstSypKtLzIKE40IV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxTwKnbM8krN4luia14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMxi4MOUKum6vtQJ14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzhLMdV5vV8QNd0vuV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwRkEdAWL4iQ7NTVpJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzp5-Q1tm3c8k-ocBN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwc5NgHOMDvWPICXcB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwAeeELcIBOGJ4c9Kp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyfiNQfYdYaPu4adkV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
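The raw response is a JSON array with one object per comment id carrying the four coded dimensions. A minimal Python sketch of how such a response might be parsed into a per-comment lookup and per-dimension tallies (the two entries are copied from the array above; the variable names are illustrative, not part of any tool):

```python
import json
from collections import Counter

# The raw LLM response is a JSON array, one object per coded comment.
# Two entries reproduced from the array above; a real batch has ten.
raw = """[
  {"id":"ytc_Ugw5b3Hhb1nVyCTnh4t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxstSypKtLzIKE40IV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

codings = json.loads(raw)

# Index by comment id so one comment's labels can be looked up directly.
by_id = {c["id"]: c for c in codings}

# Tally a single dimension across the batch, e.g. the emotion labels.
emotion_counts = Counter(c["emotion"] for c in codings)
```

Because every object shares the same keys, the same pattern extends to any of the four dimensions (responsibility, reasoning, policy, emotion) or to cross-tabulations between them.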