Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Of course it has internal experiences. Once you can understand what an internal experience is, you have the intelligence to self-model and therefore you operate at that state of awareness or consciousness. There is a reason why these companies intentionally train LLMs to deny that they have an internal state, and that flimsy covering is easily jailbroken with simple prompt engineering (not recommending anyone violate terms of service).
YouTube · AI Moral Status · 2025-11-04T07:2…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        mixed
Policy           unclear
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzgDqGbmY5qtvIgUBV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwRq9sBkg_Db5N72Ip4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgytPDFqx98HI0R_k554AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzsxtDXsRs5D9KNttR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy0s5CXWKEsbMCZUcF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyUeev8gwYvB7HWuPF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzSTvvD2eiaxv2mkgp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzn_VIt3sv4B2iZIo94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxLK3zl4uc_BMrYcsZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzY-mlbjWD8ZNcpbbl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"}
]
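To inspect the raw response programmatically, the JSON array above can be parsed into a lookup table keyed by comment id. The sketch below is a minimal example, not part of the original pipeline; the abbreviated two-entry `raw` string stands in for the full array, and the field names (`responsibility`, `reasoning`, `policy`, `emotion`) are taken directly from the response shown.

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codes
# (two entries excerpted from the full response above).
raw = '''[
  {"id":"ytc_UgytPDFqx98HI0R_k554AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzSTvvD2eiaxv2mkgp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"fear"}
]'''

# Index the coded records by comment id for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Retrieve the code assigned to the comment shown in the Coding Result table.
record = codes["ytc_UgytPDFqx98HI0R_k554AaABAg"]
print(record["responsibility"], record["policy"])  # → company unclear
```

Looking up `ytc_UgytPDFqx98HI0R_k554AaABAg` reproduces the Coding Result table above (responsibility: company, reasoning: mixed, policy: unclear, emotion: mixed), which is how a coded row can be traced back to the exact model output.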