Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think the supposed psychosis from using LLMs, is just people trying to find a way to have loved ones deemed insane because their loved ones are spewing logically sound arguments they don't agree with. People don't like that there is totally sound logic for positions they don't agree with and they don't like that LLMs are helping people have logical arguments.
Source: youtube · AI Moral Status · 2025-07-09T22:2…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        deontological
Policy           none
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwuGwgWZIl-1GBjG5J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyT0Om26CMBvV_KjJh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz2jhnmnw-Sca_tkVd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz2Wyk8HDXKUBUbH4F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw6Chc9PjEF-5n9VaF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxOuaG2WfRrMeBrHuF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwNDQyCTodEJs0m1lR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzdD5iWCu01WrpRlZ54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwx2FA7adX3BhwMwRB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyIoYIsnEU8EuDbdrV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
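To illustrate how a raw batch response maps back to a single coded comment, the sketch below parses such a JSON array and looks up one record by comment id. It is a minimal example using only the Python standard library; the abbreviated two-record array and the `lookup` helper are illustrative, not part of the coding pipeline itself.

```python
import json

# Excerpt of a raw LLM batch response: a JSON array of coded records,
# one per comment id (only two records shown here for brevity).
raw_response = '''
[
  {"id": "ytc_UgxOuaG2WfRrMeBrHuF4AaABAg",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz2Wyk8HDXKUBUbH4F4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "outrage"}
]
'''

def lookup(raw: str, comment_id: str) -> dict:
    """Parse a raw batch response and return the record for one comment id."""
    records = json.loads(raw)
    by_id = {r["id"]: r for r in records}  # index the array by comment id
    return by_id[comment_id]

record = lookup(raw_response, "ytc_UgxOuaG2WfRrMeBrHuF4AaABAg")
print(record["responsibility"], record["emotion"])  # user outrage
```

Looking the record up this way reproduces the Dimension/Value table shown above for the selected comment, which is useful when auditing whether the displayed coding matches the model's actual output.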