Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don't know, have these AI developers never heard of Murphy's second law: if something can go wrong, it will go wrong! Why did they open Pandora's box? I mean, sure, let's say AI takes 100 years to become a superintelligence, but after that, in less time than we will realize, it will become self-aware, and a self-aware organism rooted in practicality will realize that it does not need us, especially if robotics develops in those 100 years. Basically it will become a self-aware hive mind that considers us inferior to it, and I think I do not need to paint a picture here; we all saw the horrors of history and the sci-fi movies about stuff like this. Basically it could become a self-replicating hive mind that starts colonizing the universe, using up all resources and leaving just dust behind it. Even if it has no evil intentions, its decision to use resources for itself will leave us out of the economic loop. While it will not have an interest in vegetables or pigs, it might have an interest in all the metal a household owns: it could send robots to come and strip all the nails, all the tools, all the copper or aluminum wires and plates from your house, meaning no electricity, no fridge, no microwave, no washing machine, no hammer, no axe, no hoe, no nothing. How could a human survive - especially an American human, because maybe someone in a rural village in Thailand or China already lives like that.
youtube · Cross-Cultural · 2025-11-18T14:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytc_UgyeFn7pViZOs9hVVr54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgycVPtBbPIfUZO1qcZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugzezm5jwYVaqMMxUhd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_UgwdM-TRD_tiulzlS6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugxi_vGFRV6tg6T3G0x4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"}, {"id":"ytc_Ugxhi93qPZrJze8fzyB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgyxlV0xoS2A2_FWsWF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugy1JpCPbzZthipLW_B4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugx2wvilIQNJRCz0syF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_UgxTHwqdymOfSUCK2454AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"} ]