Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Does this guy really want robots to be on the same level of humans? Do we really have to share? Why can't we be enough for ourselves? I don't want to be assisted by a robot instead of a person, I mean look at how it could change expression! And they will destroy humans one day, if we keep making them more and more intelligent. But we might deserve it in the end.
Source: youtube · AI Moral Status · 2018-07-25T18:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  distributed
Reasoning       consequentialist
Policy          ban
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugy1nguXvHEeTlkXYt94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxjyQ5lgGAT0nqi4hR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz_q1lXggnMTTEsCoF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxihSLgn48W_P_TFIF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "disapproval"},
  {"id": "ytc_UgxDFhB1OWd9rjmZ2il4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwaH8cTaMI3KUdXSh14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "disapproval"},
  {"id": "ytc_UgyyjYkSrtCXO1R4_U94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzrGfPL75fAZ1TAQ_x4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzyhneG2gglcvPj2BJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwK97r4jBmYaQXVcMd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
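The raw response is a JSON array with one object per coded comment, keyed by the comment `id` and carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and a single comment's codes looked up, assuming the model always returns well-formed JSON in this shape (the id used below is taken from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codes
# (truncated here to two entries for brevity).
raw = '''[
  {"id": "ytc_Ugy1nguXvHEeTlkXYt94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxDFhB1OWd9rjmZ2il4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Index the batch by comment id for O(1) lookup.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Sanity-check that every entry carries all four coding dimensions.
for entry in codes.values():
    assert all(dim in entry for dim in DIMENSIONS), f"incomplete entry: {entry}"

# Look up the coding result shown in the table above.
result = codes["ytc_UgxDFhB1OWd9rjmZ2il4AaABAg"]
print(result["responsibility"], result["emotion"])  # distributed fear
```

In practice a real response may be malformed (extra prose around the JSON, missing fields), so a production parser would wrap `json.loads` in error handling and flag entries that fail the dimension check rather than asserting.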