Raw LLM Responses

Inspect the exact model output behind any coded comment.

Comment
There's no accountability in this technology. A human driver wouldn't do this because it is a crime. It is basically kidnapping or false imprisonment. But when the car does it, I guess it's okay? It's just a funny bug? This isn't a good direction for society.
Source: YouTube · AI Harm Incident · 2025-12-24T02:3…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugy4dw_LzEYjDaszAy94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzu99oWvD-C4Yq-JrF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyBJR_QfAzOL68oTXh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxxEM58A-queajnWjZ4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwJQHlD2jo_QZ3wzi14AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugz39XFycav6_r-v-uN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwvHfdXK39vLgr6Y1J4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugyv24XLkX22KfQ9Iat4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyrbQE2UMmUG5CrWQt4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzKr6LKdWFCJOKWwcp4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
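A raw response like the one above has to be parsed and validated before the per-dimension values can be stored as a coding result. The sketch below shows one way to do that in Python. The label sets in `ALLOWED` are inferred only from the values visible in this sample (the real codebook may contain more labels), and `parse_llm_response` is a hypothetical helper name, not part of any known pipeline.

```python
import json

# Allowed labels per dimension, inferred from this sample response
# (assumption: the actual codebook may define additional labels).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "government", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding}.

    Rows with a missing id or an out-of-vocabulary label are dropped,
    so a malformed model output cannot corrupt stored results.
    """
    out = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        coding = {dim: rec.get(dim) for dim in ALLOWED}
        if all(coding[dim] in ALLOWED[dim] for dim in ALLOWED):
            out[cid] = coding
    return out

# Usage: extract the coding for the comment shown on this page.
raw = ('[{"id":"ytc_UgwJQHlD2jo_QZ3wzi14AaABAg","responsibility":"distributed",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
coded = parse_llm_response(raw)
print(coded["ytc_UgwJQHlD2jo_QZ3wzi14AaABAg"]["emotion"])  # outrage
```

Validating against a fixed label set matters here because LLMs occasionally emit labels outside the requested vocabulary; rejecting such rows keeps the coded table consistent.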