Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The first problem is expecting a human driver to take over in time. Automation that can seem like automatic driving at a minimum promotes inattention, since it is doing the majority of the work. People will be much slower to act when there is a need, and the realization of that need will likely come too late. There are similar problems with aircraft automation: in some ways it helps pilots, but they can develop dependency, so they get specific training on a regular basis to deal with these issues. Drivers get no such training. At worst, such automation enables total distraction, like paying attention to one's phone. Disengaging is also completely the wrong response at such a late moment. If the automation is going to hand over control, it needs to try to avoid first. I also note that the car was staying in the lane next to the vehicles on the side of the road; by law in most places one has to move over a lane. That was perhaps the early warning for the driver to take over, but it came too late. The system is no good if it doesn't follow traffic basics like that. Personally, automatic driving seems more of a pipe dream until the roads are specifically constructed for automatic driving and only automated vehicles are allowed.
Source: youtube · AI Harm Incident 2025-02-09T00:0… · ♥ 1
Coding Result
Dimension: Value

Responsibility: distributed
Reasoning: consequentialist
Policy: regulate
Emotion: fear
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyAFQBeSTNdDFCNLCd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzFJUGkjhTnVZaO2cd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz8Kq0kkDafXWeuKAZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxyJw0lBpwlB8t0uM94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx6BdvFC32STL9goAp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyVwdsHIRRfPHIeXlh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwisFB6Iwt5bwYPyDh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwFsr2csxlbD3r8UiN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgypHeJg7YS7YKruSSd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugxh50gLsl7n_pjy-594AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
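To recover a single comment's coding from a batch response like the one above, the JSON array can be parsed and indexed by comment id. The sketch below assumes the raw response is valid JSON (as here); the two rows are copied verbatim from the response, truncated for brevity, and the id looked up is the one whose coding matches the result shown above (distributed / consequentialist / regulate / fear).

```python
import json

# Truncated sample of the raw LLM response: a JSON array of per-comment
# codings. Ids and values are copied verbatim from the response above.
raw = '''[
  {"id":"ytc_Ugx6BdvFC32STL9goAp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyAFQBeSTNdDFCNLCd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# Index the codings by comment id so one comment's result can be looked up.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_Ugx6BdvFC32STL9goAp4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # distributed fear
```

A dict keyed by id makes the lookup O(1) per comment and also makes it easy to detect ids the model dropped or duplicated before accepting a batch.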