Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think you were a little too simplistic about the Luddites. "For the greater good" was not why they made those machines. The moral good that resulted was not planned… it's a happy accident. It's a perfect example of how, within a capitalist market, labor (i.e. people) is expendable, tossed aside for profit. Workers could be included in the improvements, but that's never the result (i.e. why the skilled laborers destroyed the machines after being fired). That's the entire lie of "unskilled" labor!!! No one is unskilled in their labor. "Technology soothes this conflict"… oh, but they are, because a machine can do it instead: hiding behind cheap results while masking the crumbling reality. People who are against "technologies" for the sake of their existence are not stupid monkeys but rational actors. "It was better because everyone could get cheap cloth" was not going to be a known result… 😅 The immediate result for the Luddites was a loss of livelihood and jobs. The destruction of centuries of production and craft. We keep telling this lie, btw. That automation will help "somebody". Who is this somebody who will get a job? I've never met them. But I've sure met the people who lost their jobs, careers, and livelihoods to automation for the profit line… 😅
Source: YouTube · Viral AI Reaction · 2025-09-02T04:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgyniZ5lWx5-lhAeR6l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgwBTc2yIahuv7IhweV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxJzoqS0h2TFSLYkj94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugzp9Q28Hnunq7LjUs54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugy4L04UsjhUdVeDJPt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgyZ6EP-2ZS-wactyLJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgxzGGJ0mzI47KKXCZJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"}, {"id":"ytc_UgzVkKTbcuS4e6tkvtZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgxlN3XxYQoMzqQLP-p4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgwObzubD6FKCEE2ZFV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"} ]