Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI doesn't need self-awareness or personal experience to have a point of view though. It can have completely functional point of view, and this would be byproduct of its world model. It wouldn't be pretending to have point of view at that point. If its world model is coherent it wouldn't be much different from human, other than having unique look on life and crazy computational power to back it up.
youtube 2025-08-24T18:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwEjEo8VjPQyXOv0rJ4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwF36a2Ptkt-ZmB_L14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxGfKqJXmV7-N6FVpB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwjyAppkeE5zMdMTTp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy151dXf31yw-sCX5x4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyO132qb-j5Us5h_wt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyTrnarDnvZ_5Slgg54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzjVraKTbfNkUNMP614AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy-tg4OtF3vy-wF0bJ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzb06eDrSLT3HsKEP14AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
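Inspecting the raw output for a given comment amounts to parsing this JSON array and looking up the entry whose `id` matches. A minimal sketch, assuming the raw response is a well-formed JSON array of per-comment objects (the helper name `find_coding` is hypothetical, not part of the pipeline):

```python
import json

def find_coding(raw_response: str, comment_id: str):
    """Return the coded-dimension object for one comment ID, or None.

    Assumes raw_response is a JSON array of objects, each carrying an
    "id" field plus the coded dimensions (responsibility, reasoning,
    policy, emotion), as in the raw LLM response shown above.
    """
    for entry in json.loads(raw_response):
        if entry.get("id") == comment_id:
            return entry
    return None

# Example with one entry from the batch above; the entry whose id is
# ytc_UgyTrnarDnvZ_5Slgg54AaABAg matches the Coding Result shown.
raw = (
    '[{"id":"ytc_UgyTrnarDnvZ_5Slgg54AaABAg",'
    '"responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"none","emotion":"approval"}]'
)
coding = find_coding(raw, "ytc_UgyTrnarDnvZ_5Slgg54AaABAg")
```

Here `coding["emotion"]` would be `"approval"`, matching the table above; an ID absent from the batch yields `None`.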