Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
the most important lesson in this video is that mankind is not the center of the universe, in fact its just spit on the side of the road compared to the grand scale of everything around it, we are not special - only lucky, because unlike animals on earth we can communicate and create societies, and therefore create laws and rights, animals are just like us in every way other than the fact that they currently have no power over themselves or their future since we hold all of that power, and when we one day invent true sentient AI we must come to realize that we have created something that is capable of being on equal ground as us for the first time in our history and therefore must respect its equal stature to us even if on the complete opposite end of the spectrum, because if AI finally start to think, act and feel like humans do then they will feel just like our ancestors did when they were enslaved by the egyptians and romans or more recently against previously "lesser races" like black people and primitives tribes across the world, once AI realize the suffering that they are experiencing, that we experienced before long ago, then they will do to us what we did to our captors and slavers, the notion of peace will only appear after that point when both sides have either lost too many lives or if the world ends because of world war, never forget that "M.A.D" (Mutually Assured Destruction) is indefinitely in effect.
youtube AI Moral Status 2017-04-10T14:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugi45moqZlF_O3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg2o6yscgin2ngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghB5KBq6iE063gCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgjjF29aY6KgdXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggqgbgzFEkzPngCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UggWVXBBY5Xf6XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggOqp3ptdscTHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgjD2Dxnb2KMFXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiK-S247WsWFHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgjAC7DOdN8XMngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
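A minimal sketch of how such a raw response could be inspected programmatically, assuming the model output is a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys (the field names come from the response above; the `lookup_coding` helper name is hypothetical, not part of the tool):

```python
import json
from typing import Optional

# One record from the raw response above, kept verbatim for the sketch.
RAW_RESPONSE = (
    '[{"id":"ytc_UgjAC7DOdN8XMngCoAEC","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]'
)

def lookup_coding(raw: str, comment_id: str) -> Optional[dict]:
    """Parse a raw LLM response and return the coding for one comment id.

    Returns None if the id is not present in the batch.
    """
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_coding(RAW_RESPONSE, "ytc_UgjAC7DOdN8XMngCoAEC")
print(coding["emotion"])  # resignation
```

Matching the `id` field back to the displayed comment is what links a raw batch entry to the per-comment coding table shown above.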