Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If this car gets hacked, you could seriously be kidnapped. Even if it isn't hacked, there should be an AI assistant if the car gets lost so you can provide directions, and there should be a stop button so you can get off when needed. I still would prefer to have conversations with a real taxi driver anyway...it's part of what makes a good travel experience.
YouTube · AI Harm Incident · 2025-12-11T22:0…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugy4dw_LzEYjDaszAy94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzu99oWvD-C4Yq-JrF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyBJR_QfAzOL68oTXh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxxEM58A-queajnWjZ4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwJQHlD2jo_QZ3wzi14AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugz39XFycav6_r-v-uN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwvHfdXK39vLgr6Y1J4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugyv24XLkX22KfQ9Iat4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyrbQE2UMmUG5CrWQt4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzKr6LKdWFCJOKWwcp4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]