Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You assume that AI will have the same kind of consciousness a human would have. That's like trying to say plant feel the same pain human feels. It would be best if you didn't get into the habit of using human standards to measure something that is entirely different from a human. New standards will need to be in place when you're trying to discover something new. Thinking outside the box is basically the key to this.
youtube AI Moral Status 2023-07-07T08:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgzYaEAG7scD87uNGG54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx0sQCDwXCZwmPtysp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzLM6mdrBW7r3svbf14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwgAvs-ukMlH-U3C8B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzbWTcEtt-tYE_Q7ep4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy1eKk-zgYwkXPzbVN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyUYmOViLDl6xsKnQp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwoznhfANkeEUtSIeN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwdfYsHbAhtDduuGdl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyqK9hgG_fWdbuFkHd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
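A response like the one above can be parsed and sanity-checked before the per-comment codes are stored. The sketch below is one way to do that, assuming the field names shown in the response; the allowed value sets are inferred only from the values visible here (the real coding scheme likely defines more categories, e.g. additional `responsibility` or `emotion` labels), so treat them as placeholders.

```python
import json

# Allowed values per dimension. These sets are inferred from the values
# visible in the raw response above; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "government", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"unclear", "regulate"},
    "emotion": {"indifference", "fear", "mixed", "resignation", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and flag out-of-scheme values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}"
                )
    return records

# Usage with a single hypothetical record (the id is a placeholder):
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
records = parse_coding_response(raw)
print(records[0]["emotion"])  # indifference
```

Validating each record against a fixed scheme catches the common failure mode where the model invents a label outside the codebook, which would otherwise silently corrupt downstream tallies.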