Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think this is a similar problem to determining if animals are people: they're beings unlike us, so we can't tell they're people just by figuring they must have personhood for the same reason we do (our own personhood being assumed). Unlike animals though, AI have different origins but can also communicate in our own language. I would say an AI was a person if it independently decided to do things that exceeded its programming, and also claimed personhood.
youtube 2018-01-07T09:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        contractualist
Policy           unclear
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwnoaUFsCXOzkcq_ZV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzjsTx15lmTPOSXuNp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugz3Hd6Yy7OzEokMsh54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyfJjctTYtUTQ1NYC54AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyBGWuboWU8p8j1MLt4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyMR_ops1KPF2jmqGZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyX53NRMtwP-25lMZ94AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugx7Xc_2p8kHoVXJdNF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxiPPM064mTrFy7cWl4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyYvb7u2H_-hs16vbd4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"}
]
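The raw response is a JSON array with one record per coded comment. A minimal sketch of looking up one comment's codes by its `id` (assuming only the field names visible in the response above) might be:

```python
import json

# A one-record excerpt of the raw LLM response shown above.
raw = '''[
  {"id": "ytc_UgyX53NRMtwP-25lMZ94AaABAg",
   "responsibility": "none", "reasoning": "contractualist",
   "policy": "unclear", "emotion": "mixed"}
]'''

# Index the parsed records by comment id for direct lookup.
codes = {rec["id"]: rec for rec in json.loads(raw)}

rec = codes["ytc_UgyX53NRMtwP-25lMZ94AaABAg"]
print(rec["reasoning"])  # prints: contractualist
```

The dimension values in the record match the Coding Result table above, which is how a raw response can be cross-checked against the stored coding.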