Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think the question that many people aren't really talking about is if they use ai to make art and stuff. What's next for us? Are we just not going to do anything because we have ai to do it for us? Like would we just live in a society where we would be like "hey guys disney just dropped frozen 9758"? Like what would there be to do once everything gets automated and saturated into oblivion? How would anyone get feedback if everything online is ai generated? Idk, that doesn't sound like progress to me, that just sounds like removing the need and practically the want to do anything. And no, not everyone is going to pack it up and lock into a stem field, so it makes it hard for me, who is getting back into creating to see a future where what I make is appreciated, as likely as soon as I post one thing a million copies of it would be made in a "better" fashion for free. And I know I'm probably not gonna be fighting a million legal battles to save it. Like... why are we trying to replace our own creativity with robots so badly? This just drives us further apart from each other. Its real sad to think about. Like ai is trying to wedge itself in every aspect of our life. What's next? It's gonna start giving me text summaries of my own thoughts and emotions since I'm too lazy to evaluate them myself or smth? We're finna be living like people from wall E 😭😭.
Source: YouTube · "Viral AI Reaction" · 2025-03-15T06:0… · ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugxtw-nLb_XzGPzQZfV4AaABAg", "responsibility": "none",        "reasoning": "virtue",           "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgxszZklRvy0tj-_3DN4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_UgzFdkuMi442HHbQTOZ4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugy7ed9lorxfwqb1rjp4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxBevmRhBEM51nvPZl4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_Ugw3xyW3TFODLTk7lyt4AaABAg", "responsibility": "ai_itself",   "reasoning": "virtue",           "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgzKg5eGhfD-hxwm9SR4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgygijgGiktm8BXnlgl4AaABAg", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugyn9zd6RUxLH8dDSYp4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgyHX9nyoYT9zShKpxB4AaABAg", "responsibility": "distributed", "reasoning": "virtue",           "policy": "regulate",  "emotion": "resignation"}
]
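When inspecting raw LLM responses like the one above, a quick sanity check is to parse the JSON and flag any codes that fall outside the coding scheme. A minimal sketch in Python, assuming the category vocabularies are exactly the values observed in this export (the real scheme may include additional options):

```python
import json

# Assumed vocabularies, inferred from the values visible in this export.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "unclear", "liability", "regulate"},
    "emotion": {"outrage", "mixed", "indifference", "fear", "resignation"},
}

def validate(raw: str) -> list:
    """Parse a raw LLM response and list any out-of-vocabulary codes."""
    problems = []
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append(
                    {"id": rec.get("id"), "dimension": dim, "value": rec.get(dim)}
                )
    return problems

# Usage with a hypothetical one-record response:
raw = ('[{"id":"ytc_x","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(validate(raw))  # an empty list means every code is in vocabulary
```

An empty result means the response can be ingested as a coding result directly; any entries it returns identify the comment ID, the offending dimension, and the unexpected value, which is usually enough to decide between re-prompting and manual review.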