Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
That's why you need to train AI models with morals to preserve human lives over their own if they don't have a choice. They're effectively toddlers, and they need to go to school to learn morals and stuff, just like humans do.
Source: YouTube · AI Harm Incident · 2025-07-25T03:3…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           regulate
Emotion          approval
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxdE3l-fDH2axgoFmh4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugw3wirU5ozAvxQYD4J4AaABAg", "responsibility": "developer",   "reasoning": "virtue",           "policy": "unclear",  "emotion": "outrage"},
  {"id": "ytc_Ugxvv9jlkHrCVeg9sEh4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugy8e9PVCp_04GhuN8R4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugzr1mlkani93OXomrd4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgxCIBNYZNRJHm1xcA94AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_UgwPfjTLyqfT2iqXTP94AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_Ugzsl8r96ceOUZ2tY594AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgyX-i_CHSRxcN5fUbZ4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwTY5ifZtprwsDWxmB4AaABAg", "responsibility": "distributed", "reasoning": "virtue",           "policy": "regulate", "emotion": "outrage"}
]
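A minimal sketch of how the raw response above can be matched back to a single comment's coding result: parse the JSON array and look up the entry by comment id. The helper name `coding_for` is hypothetical (not part of any tool shown here), and the raw string below is abridged to the one entry whose id matches the coding result displayed above.

```python
import json

# Abridged raw LLM response: in the full log this array holds ten entries;
# only the entry for the displayed comment is reproduced here.
raw_response = '''[
  {"id": "ytc_Ugy8e9PVCp_04GhuN8R4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "approval"}
]'''

def coding_for(raw, comment_id):
    """Return the coding dict for one comment id, or None if absent."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return entry
    return None

coding = coding_for(raw_response, "ytc_Ugy8e9PVCp_04GhuN8R4AaABAg")
print(coding["responsibility"], coding["policy"])  # developer regulate
```

Because the model returns one JSON object per comment, keying on `id` lets each coded dimension be checked against the table the tool renders, without trusting the rendering itself.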