Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
FSD, at least for the last 12 to 18 months, when properly supervised by a competent driver is many times safer than most human drivers. Period. The unsafe operation of any car can result in injury and death. Period. With the advent of version 14.2.1 of FSD around Thanksgiving, it seems to me that FSD alone, even without supervision, drives like a patient, courteous, confident and attentive human driver, probably safer than most human drivers even were it not being supervised. Anything you are paid with which to pad your shyster wallet that delays Tesla bringing more of this quality of autonomy and safety to American roads is blood money. And btw, is that the Florida case where a fellow was digging around on the floorboard to get the phone he dropped, while he had Autosteer (which was not FSD) holding his car in the lane and a pedestrian was killed? He was not properly operating the vehicle, of course bad things happened. It was sad. Tragic. Human. It is with the hope of eliminating such human error on our roads that Tesla and other companies are attempting develop and deploy autonomous driving.
youtube AI Harm Incident 2025-12-12T03:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyhSztiv_TZd8uEF-54AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzBoxYsnuaGB591nnJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy6-JWMgh2ppA5iRkp4AaABAg","responsibility":"government","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx17gZzW9FDASpf7bJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxnfmUUKxNGe89WeWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzl60eMyvccDhvnZ0J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgydO5vV4t_8xi3GQ-B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgzqkAHOZVQtHRWETu14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzZ9gg-G7XknLeq6iF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzCkxR1f3mz0QAMItN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
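Since the raw response is a JSON array keyed by comment id, inspecting the coding for any one comment amounts to parsing the array and indexing it by `id`. A minimal sketch (the `raw_response` excerpt below is copied from the response above; the lookup id is the entry whose values match the Coding Result table):

```python
import json

# Excerpt of the raw LLM response: a JSON array of per-comment codings.
raw_response = """
[
  {"id":"ytc_UgyhSztiv_TZd8uEF-54AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugzl60eMyvccDhvnZ0J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
"""

# Index the codings by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Retrieve the coding for one comment; these values match the
# Coding Result table above (responsibility=none, emotion=approval).
coding = codings["ytc_Ugzl60eMyvccDhvnZ0J4AaABAg"]
print(coding["responsibility"], coding["reasoning"], coding["emotion"])
```

The dict-by-id index also makes it easy to spot ids the model dropped or duplicated, by comparing `codings.keys()` against the batch of comment ids that was sent.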