Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
My husband is constantly arguing with Chat GPT, because he just wants to get IT to contradict itself, which it does regularly - the problem here is that AI doesnt replace "intelligence" or "stupidity" - it is not "neutral" , doesnt forsee someone making a bad choice or care about someones safety - unless it has been programmed to do so. "Stupid is as Stupid Does" will continue
youtube AI Harm Incident 2025-12-27T12:1…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       virtue
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgwILKvkxmQhLgRWNat4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugwa5gRXjUYtEtlU6HB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugyp1QSzpSck0GX3uP54AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugzf-EMAGcNnO5fOoUd4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxlU_T8F9vu6lieKo14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxIHjpqkC15PeniksB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxcIrvCNSL08LZzRS94AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzvyoX_fg7cMRQSGTV4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugypag6C9Yj_ngOTiGx4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzMH08M8lH2oh13WUR4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]
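Inspecting a coding by hand can be scripted: the raw response is a JSON array of per-comment objects, so pulling out one comment's coding is a parse plus an id lookup. A minimal sketch follows, using an excerpt of the raw response above; `find_coding` is a hypothetical helper name, not part of any tooling shown here.

```python
import json

# Excerpt of the raw LLM batch response above (two entries, shortened for brevity).
raw_response = '''[
  {"id": "ytc_Ugyp1QSzpSck0GX3uP54AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugzf-EMAGcNnO5fOoUd4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "indifference"}
]'''

def find_coding(raw: str, comment_id: str):
    """Parse a batch response and return the coding dict for one comment id."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    return None  # id absent from this batch

# Look up the comment shown on this page by its id.
coding = find_coding(raw_response, "ytc_Ugyp1QSzpSck0GX3uP54AaABAg")
print(coding["emotion"])  # → resignation
```

The returned dict carries the same four dimensions as the Coding Result table (responsibility, reasoning, policy, emotion), so a mismatch between the two is easy to spot programmatically.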