Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As a rider with 50 years and 750k miles under my belt and a Tesla driver with 3 years and 45k miles under my belt I feel that I can approach this subject with more than a little objectivity. Anyone who has ridden much knows that car drivers are an enormous hazard. Between the increasing inattentiveness, ineptitude, the sense that they are invulnerable, and (sometimes) outright hostility, I've learned to simply not allow car drivers to kill me.

So, the real question here is one of relative risks. It goes without saying that nothing is perfect, so if that's the standard then there will never be change. But if one accepts that what matters is relative risk, then one can make reasoned decisions. When I'm riding I simply won't allow 4 wheel vehicle drivers to get near me....certainly not follow closely. This has little to do with any concerns I might have about any automation in their vehicles, but instead about all the other factors listed above.

So, as a rider, which is the greater risk? I firmly believe that the risks associated with vehicle automation are far more acceptable than the risks associated with the ambient incompetence out there. The sooner people no longer drive their 2+ ton rolling hazards the sooner those of us on 2 wheels will be safer. The thing about FSD is that unlike humans, it gets better over time. The defects that resulted in the deaths of these 2 bikers were likely fixed some time back. So, in this sense, the video is more or less irrelevant. But even if there is still some risk, I'd gladly accept that risk over what I've dealt with over the past 50 years.

Discussions about the relative benefits of optical versus radar versus lidar are not terribly useful as none of us laypeople are adequately knowledgeable about the intricacies of how to make FSD work as to be able to render any meaningful judgments. While there is a tendency to discount accident statistics, the data is clear. The more automation in cars, the less they crash. No ifs, and, or buts.

So, where does that leave us? While I enjoy FortNine content, I think in this case it's just more FUD. To frame the issue the way it's been framed is simply incorrect. What counts is relative risk, not hypothetical risk. I'm firmly convinced that the sooner folks give up pretending to drive the safer we will all be. Whether it's Tesla or anyone else, this needs to happen, and the longer we come up with obstacles to full implementation, the more motorcyclists who will needlessly die. As for your assertions regarding Tesla's motives, these are just opinions with little actual understanding of the technical underpinnings of making FSD work. If you truly want to help motorcyclists, then please support rather than oppose the development of FSD. FUD like this only perpetuates the danger that we motorcyclists face every day.
youtube AI Harm Incident 2022-09-06T16:2… ♥ 6
Coding Result
Dimension      | Value
Responsibility | user
Reasoning      | consequentialist
Policy         | none
Emotion        | resignation
Coded at       | 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugwijou4cZywxDHvkil4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzaaQAN2LOIMrtIFk94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzuFbBGTk91n5cRqvd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzcZC9tbBgwqoGqdMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyALLyMAc_pY0R4nJh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw044-5uXVIsrAOJzV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyr7_qzNvmwE1Vu9Bp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgwObOlQynetUkHYq1Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzFF3KTuHnW0XX1o594AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwi5NF8At6afXzXbK14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
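The raw response is a JSON array with one object per comment: the comment `id` plus the four coding dimensions (responsibility, reasoning, policy, emotion) shown in the table above. A minimal sketch of how such a batch response could be parsed and the coding for a single comment looked up, in Python (the variable names and the single-object sample here are illustrative, not part of the tool):

```python
import json

# Sample of the raw LLM response format: a JSON array of per-comment
# codings. Only the entry matching the comment shown above is included.
raw = """[
  {"id": "ytc_UgwObOlQynetUkHYq1Z4AaABAg",
   "responsibility": "user",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "resignation"}
]"""

# Index the batch by comment id so any coded comment can be looked up.
codings = {entry["id"]: entry for entry in json.loads(raw)}

coding = codings["ytc_UgwObOlQynetUkHYq1Z4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → user resignation
```

Indexing by `id` is what lets the inspection view map each raw array element back to the comment it codes, since the LLM is not guaranteed to return entries in the order the comments were submitted.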