Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Uhhh ✨️quick✨️ question ...what if she decides to turn on us😳.WAIT! actually we're the smart ones and ai is just feeding off our knowledge sooo...I think we'll be just fine if you guys just stop making humanoid designed androids OK🙂pls
YouTube · AI Moral Status · 2023-10-14T12:5…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxFofsUjfUkZesjxx14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw_cwb9d7cef_YdSeB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy5Bql9r69JH7BBnEt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzZ3XkateNYObGy2il4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwbMfHylpiME7w1CMJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy9BMh9JIGRQoX4unF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyG7OHVcQCZ4rwt_YR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxPrOgVHMrpXxmm-bJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyBxQArnyrurdSx4rt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzxMb6l3q1nOW7LNq54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
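A raw response like the one above can be checked before it is accepted into the coding table. The sketch below is a minimal Python validator, assuming the label sets are the ones observed in this batch (the actual codebook may allow additional values) — the dimension names `responsibility`, `reasoning`, `policy`, and `emotion` are taken from the response itself.

```python
import json

# Label sets inferred from the codes seen in this batch; the project's
# full codebook may define more values per dimension.
ALLOWED = {
    "responsibility": {"none", "government", "user", "ai_itself", "developer"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "approval", "resignation"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"comment {rec.get('id')}: bad {dim} = {rec.get(dim)!r}"
                )
    return records

# One record from the response above, passed through the validator.
raw = ('[{"id":"ytc_UgyBxQArnyrurdSx4rt4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes[0]["policy"])  # regulate
```

Rejecting unknown labels at parse time keeps a single malformed or hallucinated code from silently entering the results table.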