Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
But that's the thing, Dr. Tyson. History tends to gloss over a lot of things. Those people who've lost their jobs and businesses, not all of them were able to adapt and find something else that gave them the same level of economic stability and prosperity that they had before their lives were disrupted. I hazard most did not and they probably struggled until the day they died. We know better now. We should know better. We cannot let this happen to humanity, especially our creatives. Letting AI and those who are pushing it down our throats, win and normalize AI as it is right now, where we are forced to subscribe and pay big tech for the very thing that will harm other people, then we are going to own even less of the world that we own now. Neil's idea of the future expects that everyone can stay ahead of AI by doing things it cannot do, but nothing is stopping big tech from stealing new ideas and feeding it to their AIs and where will that leave us then?
YouTube, AI Moral Status, 2025-08-10T19:0…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwSyrmk1Hv14k0dap54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzvCsfxkSjYgygfH2B4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwVJPKbIJUYUf3I4Ol4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgygHJGOkrLuBDwdzrN4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxMfUj3wJg5MszdQnl4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwWO_aFhK3d-eIwsH14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx8p7YGjp-2o9OiJK54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxbB1KgFpmwm71mont4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx3k-4zgcoM-5qfLvh4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugz1BIulg81BS8nXNet4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]
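The raw response is a JSON array with one coding object per comment, keyed by comment id. A minimal sketch of how such a batch response could be parsed and matched back to a single comment (the ids and field values are copied from the response shown; the function name `coding_for` is illustrative, not part of any pipeline):

```python
import json

# Two entries excerpted verbatim from the raw batch response shown above.
raw_response = '''[
  {"id": "ytc_UgwSyrmk1Hv14k0dap54AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwVJPKbIJUYUf3I4Ol4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]'''

def coding_for(raw: str, comment_id: str):
    """Return the coded dimensions for one comment id, or None if absent."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            # Drop the id so only the dimension/value pairs remain.
            return {k: v for k, v in entry.items() if k != "id"}
    return None

coding = coding_for(raw_response, "ytc_UgwVJPKbIJUYUf3I4Ol4AaABAg")
print(coding)
# → {'responsibility': 'distributed', 'reasoning': 'consequentialist',
#    'policy': 'none', 'emotion': 'resignation'}
```

The second entry matches the Coding Result table above (distributed / consequentialist / none / resignation), which is how the viewer links a table row back to the raw model output.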