Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think it has been broken already .. there are alot of cases that AI uses for therapy... The user want to suicide .. it won't... But provide the steps to commit suicide. The same for murder as well..
youtube AI Governance 2026-03-23T07:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgzS7fsh-Ec4BCD5t0d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxGiPJ8xVUsgR8PxEx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzBFP5uPi4A2q9JDr54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw1xbXHHipJ0SR2C4F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzz8Tntn2azqgB-Rkx4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzwWfP8Hcvn4BWCgN94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzlK400lPyNR5b_hMB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgziufZeSDhhTEwYuRV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwmU98LGz6963ElgG14AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgytFlf8KB__oc0tWK94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
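The raw response is a JSON array of coded records keyed by comment id, so a given comment's coding can be recovered with a simple lookup. A minimal sketch (the helper name `coding_for` is ours, not part of the tool; the data is a subset of the response above):

```python
import json

# Subset of the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgxGiPJ8xVUsgR8PxEx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzS7fsh-Ec4BCD5t0d4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

def coding_for(comment_id, response_text):
    """Return the coded dimensions for one comment id, or None if absent."""
    records = json.loads(response_text)
    return next((r for r in records if r["id"] == comment_id), None)

coded = coding_for("ytc_UgxGiPJ8xVUsgR8PxEx4AaABAg", raw_response)
print(coded["responsibility"], coded["emotion"])  # ai_itself fear
```

This matches the Coding Result table above: the record for that comment id carries `ai_itself` / `consequentialist` / `liability` / `fear`.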