Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You gave 2 problematic examples. This leads me to believe you have not done enough research. The Uber incident was a very long time ago and it had a safety driver, your supposed safety measure that's ignored due to perception/cost. The tesla pile up turned out to not be an autopilot error, and the standard autopilot is highway only, and it is an old version of the software that is just a superior version of the lane keeping, cruise control that other automakers have. The cruise incident is about right though. It's not that I dislike your human centric city ideal, but I think AV's can be integral to it, and even be a motivator. If you don't need to park your car when you go to work, then limiting cars in certain neighborhoods becomes a lot more practical. If most people don't own a car, then those suburban sprawls you hate can become gated no car zones. You don't plan for something that does not exist. Autonomous busses will make for flexible and cheaper public style transport. It will also be profitable for the big corps you worry about. No, AV's will not go faster, faster means less range per charge and bigger batteries. More "dead" miles driven means more wear and tear, more maintenance more cost. This means there is a profit incentive not to do those things. Robotaxi car parks can have significantly smaller footprints. They will only be for charge top-up, if demand slows down parking in cheaper peripheral car warehouses will be used for excess. Elon Musk probably agrees with some of your concerns of making transport cheaper causing more traffic. That's why he created the boring company. Pedestrian deaths will not go up, partly due to your concerns of speeding up not being a problem. Partly due to the poorest AV system still being about as safe as humans, this is data that is already available. Things can only get better not worse, safety wise.
youtube · 2024-11-18T12:3… · ♥ 2
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugxgke9ak4Oc4XoSTnV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz5EpgMaLglcf6aUkN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzg_WDc_816so0bKXF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx6yNjQEPxo1RfQwH54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwbU73s3fe4lU_Vi_14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxytnlEFiulPSTgtHF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyJM-6tqkruIhsaeMZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzhI8MzloB2yZrnBo14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyOdvCelLQ1CAJuh_h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgytLg_G6vXGS_y2bKl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
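The raw response is a JSON array with one coding record per comment id, and the coding result shown above corresponds to the record whose id matches this comment. A minimal sketch of how such a response can be parsed and looked up by id (the `RAW_RESPONSE` string is a two-record excerpt of the array above, and the `code_for` helper is illustrative, not part of the actual pipeline):

```python
import json

# Excerpt of the raw LLM response: a JSON array of per-comment coding records.
RAW_RESPONSE = """
[
  {"id": "ytc_Ugz5EpgMaLglcf6aUkN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxgke9ak4Oc4XoSTnV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

def code_for(raw: str, comment_id: str) -> dict:
    """Parse the raw response and return the coding record for one comment id."""
    records = json.loads(raw)
    # Index the records by comment id for direct lookup.
    by_id = {rec["id"]: rec for rec in records}
    return by_id[comment_id]

result = code_for(RAW_RESPONSE, "ytc_Ugz5EpgMaLglcf6aUkN4AaABAg")
print(result["emotion"])  # -> outrage
```

A lookup like this also makes it easy to spot mismatches between the displayed coding result and the raw model output for the same comment id.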