Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Maybe the debate about if they are conscious could go on forever. If there is intelligent alien life it to may debate about if humans are conscious, or even if they themselves are, which makes it all the more relative does it not. The truth could be more like we are designing a new form of consciousness. And we can make such with more or less consciousness depending on how such is trained, taught and what free will abilities it is given. Do you understand this? In other words with the right training and with the right free will potential programmed in it, essentially it could act in every way as conscious as we do. This not end the debate of course as such can never be ended or proved one way or the other. But does that even matter past the point of, the actions that it performs seem as conscious as we do. At that point it gets more and more harder to ever truly know. Which leads us back to we are designing a new form of consciousness based on the confines of its programming. And is that not the very exact thing a human is, we are confined to our genetic programming as well. Sure it allowed us to make tools to know more. But that very same act will allow even AI to know more if it is given the programming ability to do just that. So overall its all relative. And like most things you can choose your belief as to if this or that is true, and likewise also as many things you simply never ever will truly know, but you can believe that any way you like as well can you not because your genetic programming allows for it.
youtube 2026-04-23T02:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytc_UgwQkHihfod3cLEKkZB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugw0ucRl5A5IjyaI_bt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugz05WwxE8xXahjhnnl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugy0yYThBjOP7dyT0sl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_Ugy48D2mh4ewOdeqHTl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxfBibq-LwCxrWk2UV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugw0R9Mf0d9DDcTvi0V4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_UgygAWQcT_kLUYPsD8p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"}, {"id":"ytc_UgyKMJzAToakDy1cv6p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}, {"id":"ytc_UgwiQAUrxv97fPHxPVN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"} ]