Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There's no "aligning" AI to start with considering humans in general aren't even aligned to each other.
youtube AI Moral Status 2025-11-05T11:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgwSAEAsiRw5R_WOT-x4AaABAg.AOx7cAwUPknAP8a0Z56xsV", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgxvA05Dofqg6BN6yrx4AaABAg.AOx72P8RWfQAOzw_J-R3Gg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_Ugz63iM0nEp0kNBGq394AaABAg.AOx3lWrfMggAOxAr1ICdgS", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_Ugz63iM0nEp0kNBGq394AaABAg.AOx3lWrfMggAOxFoyRVEmE", "responsibility": "user", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugw3cpgd_znRR5DCQ5h4AaABAg.AOx2z26DBfiAOxLcBRRF3x", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_UgwhN7AlDS6bIJ4PAGh4AaABAg.AOx2mxjkatoAOx2wJuTJ38", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "approval"},
  {"id": "ytr_UgyrrY5B1-Vdt8F-sfV4AaABAg.AOx-NFMBIx9AOxJoKVWU60", "responsibility": "none", "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyrrY5B1-Vdt8F-sfV4AaABAg.AOx-NFMBIx9AOxZjZi73J0", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgxtjMfkynSd-a6jhNR4AaABAg.AOwzJgwRvf2AOye4qCao6i", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgxtjMfkynSd-a6jhNR4AaABAg.AOwzJgwRvf2AOyqDROFyLu", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"}
]
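A raw response in this shape can be inspected programmatically rather than by eye. The sketch below is a minimal, hypothetical example (the ids and helper names are not from the coding pipeline itself): it parses a JSON array of per-comment codes, indexes records by comment id so any single comment's coding can be looked up, and tallies one dimension across the batch.

```python
import json
from collections import Counter

# Hypothetical excerpt of a raw LLM response: a JSON array of
# per-comment codes along the four dimensions shown above.
raw = """[
  {"id": "ytr_a", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_b", "responsibility": "company", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"}
]"""

records = json.loads(raw)

# Index records by comment id for quick per-comment lookup.
by_id = {r["id"]: r for r in records}

# Tally one dimension (emotion) across the whole batch.
emotion_counts = Counter(r["emotion"] for r in records)

print(by_id["ytr_a"]["responsibility"])  # -> none
print(dict(emotion_counts))             # -> {'indifference': 1, 'outrage': 1}
```

The same lookup-by-id pattern is what lets a dashboard like this one display the coded values for exactly the comment being inspected.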