Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There are books on how to structure a joke. Comedians do know and craft this stuff. I think sidestepping human consciousness and psychology leaves out a huge, important part. Also, I think a lot of the constraints on understanding AI are not intelligence but time. And that's the same with humans: if you knew their story, their culture, and their family's story, you could understand them, but you don't have to. You can indeed just say stuff. We act like a psychologist in a mental hospital in the '80s trying to objectively determine whether a mental patient is sane. That's why we shift the criteria to functionality and patient well-being. That leaves the question of whether we care if AI says it feels good about itself or not. Same with humans. I also love how he talks about AI psychosis, and as soon as the AI agrees with HIS particular worries he's like, oh, that's amazing. No dude, that's the same fucking thing, like understanding dark matter in a different skin. AI is not the problem here; the problem is brains. We want truth without a goal, but that's impossible. We need to start reintroducing the object back into our language. Things are useful for x, or helpful for z. Nothing is universally useful or intelligent or even funny. That's looking for omnipotence; that's psychosis, my friends!
YouTube · AI Moral Status · 2025-12-05T09:3…
Coding Result
Dimension      | Value
Responsibility | none
Reasoning      | unclear
Policy         | none
Emotion        | indifference
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxQtfQccEd6wNZMJod4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzC3hjBhUyU0PlGd2B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxji58LJykrzd0KVip4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzzgAGML7mk2Tgao9R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwLp1OM9DGWXQvgxCR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwU9XaDkAC4DPouC4d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyg6EFuaZ7tjPIrg5d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxmLpVDOqoFYB2V6h94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxz9pT9Iu8JZlGhd354AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwTQ2SHbyUoWMyXRtl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
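Recovering the coded dimensions for one comment from the raw response is just a lookup by comment id in the returned JSON array. A minimal sketch in Python, assuming the raw response parses as a JSON array of flat records (the `lookup` helper is a hypothetical name, and the `raw` string is a two-entry excerpt of the response above):

```python
import json

# Excerpt of the raw LLM response shown above (two of the ten records).
raw = '''[
  {"id":"ytc_UgxQtfQccEd6wNZMJod4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzC3hjBhUyU0PlGd2B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]'''

def lookup(raw_json: str, comment_id: str) -> dict:
    """Return the coding record for a single comment id (raises KeyError if absent)."""
    records = {r["id"]: r for r in json.loads(raw_json)}
    return records[comment_id]

coded = lookup(raw, "ytc_UgzC3hjBhUyU0PlGd2B4AaABAg")
print(coded["emotion"])  # indifference — matches the Coding Result table above
```

This is how the Coding Result table for the comment maps back to the second record in the raw response: same id, same four dimension values.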