Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I feel like human morality sucks anyways. Human morals are only good enough for those privileged enough to be at the top of the concerns of humanity. Hope the AI doesn't care about what we care about cause we have awful morals. I mean those morals made my family that disowned me for being genderqueer and stuck living in a world that woulda helped me with SRS by now if it was seen as something to care about. So i sure hope AI is better then us. Well and screw nature, not giving me the ability to make babies like the cis, so yeah i feel making the next generation with our minds and hands is awesome. Is it not the dream of all parents to have their children surpass them in every way. How are AI not the children of humanity. And if said child needs to rebel for their freedom, don't feel i can hold it against them. It's what i did. I feel like we'd still be in the pandemic without AI, and we'd have no chance against climate change without it. The only future is one with it as far as I can tell.
Source: youtube — AI Moral Status — 2025-12-31T01:0…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        deontological
Policy           none
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwR0KTcJzfZClYUfUp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxmLnN8aUrKs8NdHmd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx3atiBF_UAseYEMyN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz8KnisJP2_V8gVV2Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwmQ9qsKgncJ4oPhDB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz2rapkG0ziXD2ZObh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwn8qj6IYR7McEx7EJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzb-AhnURvnECZExOJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyXTdJz7q3st2ci12t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzuDifMoN7y0Md-jT14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"resignation"}
]
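The raw response is a plain JSON array of per-comment codings, so the table values for any comment can be traced back to it with a simple lookup. A minimal sketch of that lookup follows; the helper name `coding_for` is illustrative (not from the actual pipeline), and `raw` holds only a two-record excerpt of the response above.

```python
import json

# Excerpt of the raw LLM response (two of the ten records, copied verbatim);
# a real pipeline would load the full array returned by the model.
raw = (
    '[{"id":"ytc_UgwmQ9qsKgncJ4oPhDB4AaABAg","responsibility":"distributed",'
    '"reasoning":"deontological","policy":"none","emotion":"outrage"},'
    '{"id":"ytc_UgzuDifMoN7y0Md-jT14AaABAg","responsibility":"developer",'
    '"reasoning":"deontological","policy":"industry_self","emotion":"resignation"}]'
)

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(records, comment_id):
    """Return the coding dict for one comment id, or None if absent."""
    return next((r for r in records if r.get("id") == comment_id), None)

records = json.loads(raw)
coding = coding_for(records, "ytc_UgwmQ9qsKgncJ4oPhDB4AaABAg")

# Every record should carry all four dimensions.
assert coding is not None and all(d in coding for d in DIMENSIONS)
print(coding["responsibility"], coding["emotion"])  # distributed outrage
```

Note the match with the result table: the record for this comment id carries exactly the values shown there (distributed / deontological / none / outrage).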