Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Kurt Vonnegut’s Player Piano (1952) imagines a future America where machines have replaced nearly all human labor. Only a small elite of engineers and managers design and oversee the automated system, while most people are left without meaningful work, shunted into military service or government jobs. The story follows Dr. Paul Proteus, a successful engineer who becomes disillusioned with the system’s emptiness and joins a doomed rebellion to restore human purpose. The novel critiques a society that values efficiency and productivity over human dignity, highlighting the alienation caused when technology strips life of meaning.

Relevance to the AI Debate Today

Vonnegut’s vision resonates powerfully in the age of artificial intelligence. Just as automation in Player Piano displaced human workers, AI is raising fears about large-scale job loss in both physical and white-collar professions. The novel’s stark division between a technological elite and a disenfranchised majority mirrors concerns that AI will concentrate power and wealth in the hands of a few companies and experts. At its core, the book asks the same questions we face now: What happens to identity and purpose when work is no longer central to human life? How do we ensure that technology serves humanity rather than the other way around? Vonnegut’s satire acts as both a warning and a reminder that progress without values risks hollowing out the very people it is meant to liberate.
youtube AI Jobs 2025-08-29T05:1…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgzpPWHuBwtrK5Q885h4AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgzsWNYJzeyBZNqsJyN4AaABAg", "responsibility": "none",       "reasoning": "unclear",          "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgyJ6QXBlMUDa3T58sJ4AaABAg", "responsibility": "unclear",    "reasoning": "unclear",          "policy": "unclear",   "emotion": "unclear"},
  {"id": "ytc_UgxeWH4OSYdZ9fzzK5F4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugyn9ykPlnRpeFNc3Zp4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgwWABZGAh3dGPZqq7t4AaABAg", "responsibility": "government", "reasoning": "deontological",    "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgzMa8UTw_WGDJ1AnHp4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgxahWsCECMivLQcMad4AaABAg", "responsibility": "developer",  "reasoning": "consequentialist", "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_UgxJwvMbyJEIXDJmwSF4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_Ugz5rkr0aRHXc-e7H1d4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"}
]
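One way to inspect the raw model output for a specific coded comment is to parse the response as JSON and index it by comment id. A minimal sketch, assuming the raw response is a JSON array of objects each carrying an "id" plus the four coded dimensions (as in the dump above); the variable names here are illustrative, not part of the pipeline:

```python
import json

# Raw LLM response: a JSON array of per-comment codings (abridged to two
# records from the dump above for the sake of the example).
raw_response = '''[
  {"id": "ytc_UgyJ6QXBlMUDa3T58sJ4AaABAg",
   "responsibility": "unclear", "reasoning": "unclear",
   "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgwWABZGAh3dGPZqq7t4AaABAg",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]'''

# Parse and index the records so a single comment's coding is easy to look up.
records = json.loads(raw_response)
by_id = {r["id"]: r for r in records}

# Pull the coding for the comment shown in this section.
coding = by_id["ytc_UgyJ6QXBlMUDa3T58sJ4AaABAg"]
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dim}: {coding[dim]}")
```

For the comment above, all four dimensions come back "unclear", which matches the Coding Result table.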