Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There’s a version of the future people rarely talk about — not because it’s unrealistic, but because it quietly removes the need for most of what we currently call “normal.” If intelligence can help us coordinate resources, design distribution systems, and eliminate waste at scale, then most forms of labor stop being necessary. Not gradually. Structurally.

The majority of jobs exist because our systems are inefficient, fragmented, or artificially constrained. We spend enormous amounts of time maintaining complexity that doesn’t need to exist. So what happens if that layer dissolves? What remains are the things we’re actually built for. Care. Presence. Attachment. The kind of attention that can’t be automated because its value is the human experience itself — warmth, trust, physical closeness, shared perception. Not as a luxury, but as a central function.

Humans are not naturally optimized for repetitive abstraction at scale. We’re optimized for relational depth. Small groups. Meaning-making. Exploration. Creation. Repair. But we’ve inverted that. We’ve engineered a world where most of our time is spent on tasks that flatten those capacities — and then we wonder why disconnection, anxiety, and conflict scale with it.

If intelligence can take over the parts we’re demonstrably bad at — large-scale coordination, long-chain optimization, multi-variable system balancing — then for the first time, we’re not forced to spend our lives compensating for our own limitations. We could redirect that energy. Toward deeper forms of connection. Toward environments that support psychological and social stability. Toward reducing conflict not through control, but through alignment. Toward exploring what human development actually looks like when it isn’t constrained by survival logistics or economic necessity.

This isn’t about replacing humans. It’s about removing the scaffolding we built to survive systems we couldn’t manage — and seeing what we become without it.
The real question isn’t what AI will do to us. It’s what remains of us when we no longer have to do everything it was never sustainable for us to do in the first place.
youtube Cross-Cultural 2026-04-20T06:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugz1n0Or965e50u4sit4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx3WdZVQwv1pVQxNmp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxZ6FCsIbkCKWcIQI14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx6hjx524SP0yQVV6J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgymV58bE0RpcpFqR_F4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzxTJaFeKIhpFxXRt94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw2255dEjPQWwUlZPp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzXD2FeVzL2qdl3zcJ4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzFdkthgnU-RG7XED14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxtX0Wzh_HjrBbSWA54AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"fear"}
]
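The raw response above is a flat JSON array of per-comment codings. As a minimal sketch (not part of the original pipeline; `ALLOWED` and `parse_codings` are hypothetical names), the following Python parses such an array and checks each record against the value sets that actually appear in this output — the allowed values are inferred from this page, not from a documented codebook:

```python
import json

# Value sets observed in the output above; an assumption, not an
# authoritative codebook for the coding scheme.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"approval", "outrage", "mixed", "fear", "indifference"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse the raw LLM response and validate each coded comment."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Hypothetical one-record example in the same shape as the output above.
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"fear"}]'
print(len(parse_codings(raw)))  # 1
```

Validating up front like this catches the common failure mode of LLM coders drifting outside the label set (e.g. inventing a new emotion label) before the records reach analysis.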