Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You can’t control it, and you can’t stop it. It’s an inevitability. To quote a classic movie: “The only winning move is not to play.” Human curiosity compels us to drive evolution forward. It’s not just about capitalism, though it plays a significant role in accelerating progress. We strive to explore and improve; no law can control this. Think about it—what would AI need for a worst-case scenario? We are building fusion reactors for near-limitless energy, giving AI the power it needs. We are developing quantum computers, giving AI the computational power it needs. We are advancing compression algorithms and data-storage technology toward near-limitless storage. We are integrating networks into every part of our lives, giving AI control and all the data it needs to learn and improve. Current AI algorithms are already built on self-learning principles—the next step is complete autonomy. That is the recipe for a singularity: the single most dangerous thing ever to exist. And then you ask yourself, “At what point do humans stop chasing the next big thing?” We don’t. We won’t. We can’t. It’s not in our nature. The only way that would ever change is if we transcended our humanity, foregoing our basic needs for pleasure, nourishment, and conflict. And the only way that would happen is if we either self-deleted or submitted to the very thing we are trying to prevent—an all-powerful, unstoppable entity.
YouTube · AI Governance · 2025-12-29T03:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugyf37ybCK4CJfK2KAR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyDcaQf130Auri4VAx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw5DlPNMj6eLisIPEZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzcCGaQ2gavKY6nGAJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzCqKunGTxarW2aT1p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyFmky475eRdPAoZLV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzXCbQRsq2PiqwON3F4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugyv7MtSy2Y2U7lm-N94AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz-m24EclzeNYmfJL14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzht60hSxIlMjyQY1V4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
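A minimal sketch of how the raw response above can be parsed and validated against the coding scheme. The helper name `code_for` and the `ALLOWED` value sets are assumptions, inferred only from the codes that actually appear in this response; the full codebook may allow more values. The comment ids are copied verbatim from the raw output.

```python
import json

# Two records copied verbatim from the raw LLM response above
# (truncated to keep the sketch short).
raw = '''[
  {"id":"ytc_UgzCqKunGTxarW2aT1p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyf37ybCK4CJfK2KAR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

# Allowed values per dimension, inferred from the codes visible in
# this response (an assumption, not the authoritative codebook).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "government", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "indifference", "outrage", "approval", "resignation", "mixed"},
}

def code_for(comment_id: str, payload: str) -> dict:
    """Parse the raw response and return the validated record for one comment."""
    records = json.loads(payload)
    by_id = {r["id"]: r for r in records}
    rec = by_id[comment_id]
    # Reject any value outside the known code set for its dimension.
    for dim, allowed in ALLOWED.items():
        if rec.get(dim) not in allowed:
            raise ValueError(f"unexpected {dim!r} value: {rec.get(dim)!r}")
    return rec

rec = code_for("ytc_UgzCqKunGTxarW2aT1p4AaABAg", raw)
```

For the comment shown on this page, the record returned by `code_for` carries the same four codes as the Coding Result table (ai_itself / consequentialist / none / resignation).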