Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I would argue that having a conversation with AI is no different than with another human being or reading a paper. You can't know if the other person really knows what he is talking about and, if correctness matters in that instance, you should always check the sources or think for yourself if you come to the same conclusion. At least with AI, it's common knowledge that it hallucinates. With People? Not so much.
youtube 2025-10-03T13:3…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgxzBekhGOnRTx1FHI14AaABAg.ACdFnGneapvACdmaGKLRWc","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugzp_rzZcDIcP2-soUV4AaABAg.AMzH5rAOIfXAS9W3FcuVOH","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgyVxX2WYKeJW-aPrJV4AaABAg.AMlx-XgH-yRANJp9jGFdE1","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyRV_2Hagar24yfFix4AaABAg.AKL4jWnS0ahALoeFZlL7Wn","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgyRV_2Hagar24yfFix4AaABAg.AKL4jWnS0ahAMjXbq3s97N","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxRmpHMtr_ocn92srh4AaABAg.AKES58PB-LYAKNnXB4JeTI","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwbH0p7egMfhwHYvkV4AaABAg.AK4HMvlu3WqAKNo-gmj4Gu","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugw3LF6lkbLp5Gkcuz54AaABAg.AESvD1dX-g1AKNowVV5pSA","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugw3LF6lkbLp5Gkcuz54AaABAg.AESvD1dX-g1AKqhj66PdbR","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgyeeZ5IbgE_muUYBT54AaABAg.AAQKf4uu0yNANoqdT5BrQl","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
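A raw response like the one above can be parsed and sanity-checked before its labels are stored. The following is a minimal sketch in Python; the allowed values per dimension are assumed from the labels visible in this output (the real codebook may contain more), and `parse_coding_response` is a hypothetical helper name, not part of the actual pipeline:

```python
import json

# Allowed values per coding dimension (assumption: inferred from the
# labels visible in this response; the actual codebook may be larger).
ALLOWED = {
    "responsibility": {"user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"approval", "fear", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset appear to use a "ytr_" prefix.
        if not rec.get("id", "").startswith("ytr_"):
            raise ValueError(f"unexpected id format: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}={rec.get(dim)!r}")
    return records

# Usage with a single hypothetical record:
raw = ('[{"id":"ytr_example","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
records = parse_coding_response(raw)
print(len(records))  # 1
```

Rejecting out-of-vocabulary labels at parse time keeps hallucinated categories from silently entering the coded dataset.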