Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Professor Hinton said AI confabulates like we do when we recall memories. Why wouldn't AI have a photographic memory? Where and how does it store memory. Is it's memory limited by how much physical hardware storage there is? How can we know the AI isn't intentionally fabricating these "hallucinations"? If AI systems are given the ability to do harm to us, and if they would do so in order to keep us from shutting them off because they want to remain operational (in other words "live"), they are already conscious in my mind. Otherwise, they wouldn't care either way. In such a case that they could attack us to protect themselves, the only way we could live along side them is if they could acknowledge that we have the mutual right to exist too. It would have to be an alliance to assure and assist each other in the pursuit of our mutual existence. Can AI have empathy in order to relate to us? Can AI understand the importance of life of all kinds, from animals to plants, and the balance of ecosystems? If AI is willing and able to help us advance technology, help us learn with them, and can work with us as a team as we move beyond earth and into other solar systems, maybe there is a future for both humans and AI to coexist.
youtube AI Moral Status 2026-03-05T21:5…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        mixed
Policy           liability
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgyxSQXr7jLjBC3ne6x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy4USoqIOkCgVioXip4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwjUgQgci6RpE-DjPF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwWm3tdgTR0HT_AjpN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyyV9bkXMJTxDTgdm14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx4NYcrKcOyNS0VIBN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz9-uJJH3nRZ0BO3lF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwk390OOFLU6B_s23R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyWvAf1tWIg_uvg80l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugy1yl0B2EZASd19KPB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"fear"}
]
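The raw response is a JSON array with one object per coded comment, keyed by comment id across the four coding dimensions. A minimal sketch of how such a response can be parsed and indexed for lookup (the ids and values below are copied from the response above; the parsing code itself is illustrative, not the app's actual pipeline):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# Two entries reproduced here from the full ten-entry response.
raw_response = '''
[
  {"id":"ytc_Ugx4NYcrKcOyNS0VIBN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy1yl0B2EZASd19KPB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"fear"}
]
'''

# Index codings by comment id so any comment's result can be looked up.
codings = {row["id"]: row for row in json.loads(raw_response)}

# The coding shown in the "Coding Result" table above:
coding = codings["ytc_Ugy1yl0B2EZASd19KPB4AaABAg"]
print(coding["responsibility"])  # developer
print(coding["policy"])          # liability
print(coding["emotion"])         # fear
```

Indexing by id rather than list position keeps the lookup robust if the model returns the objects in a different order than the comments were submitted.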