Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The reality is that a real self-driving car's algorithm is not going to have a specific condition for being boxed in on all sides by specific permutations of vehicles. It probably won't even figure out what kinds of vehicles are out there and it certainly won't try to figure out which bikers have helmets. There will be a series of top level, general case decision trees from which specific responses are emergent. The code will basically just tell your car to take the path which takes you away from all known obstacles as fast as possible. In a thought experiment like this, it's going to end up being something like your car will swerve in whatever direction there is slightly more room, while braking to try to dodge the falling boxes.

Ethicists seem to like to think about technology in an abstract, perfect sense. They're trying to figure out how to program an omniscient car AI to respond to contrived scenarios while actual accidents are going to overwhelmingly result from software bugs and hardware failures. If a car AI is good enough to quickly and reliably figure out the complete casualty result of every possible action it can take, then it is definitely good enough to just avoid trailing a giant ass cargo truck at less than stopping distance.
YouTube · AI Harm Incident · 2015-12-14T04:2…
Coding Result
Responsibility: developer
Reasoning: consequentialist
Policy: none
Emotion: indifference
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgjItq0wivzFzHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi3UjQWwYBga3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugil7mqZ96nRsXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgizhDQN0tfbqXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjdML6iup9kxHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiDITa8mouAQXgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Uggk2g1O4hSYuXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjdS9_U-Ytg-3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjIfcNAortGP3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghasmfeHrS-OHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
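A raw response like the one above has to be parsed and schema-checked before the per-comment codes can be trusted. The sketch below is a minimal, hypothetical validator: the `ALLOWED` sets are inferred only from the values visible on this page (the full codebook may define more categories), and the function name `parse_coded_response` is illustrative, not part of any actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the records shown above.
# Assumption: the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate", "industry_self"},
    "emotion": {"indifference", "approval", "outrage", "mixed"},
}

def parse_coded_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose coded
    dimensions all fall inside the allowed value sets."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
print(len(parse_coded_response(raw)))  # 1
```

Dropping (rather than repairing) out-of-schema records keeps the coded table honest: a model that invents a new label like `"responsibility": "society"` surfaces as a missing row instead of silently polluting the counts.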