Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why is 60Mins referring to incidents that occurred *6 years* (or even longer) ago and which involves an entirely different system (hardware and software) to what FSD *today* is using? In reality, you have only to look at the record of FSD use in cybercabs operating on the streets of cities in the USA today to see that this technology 'works' (ie it causes less collisions that humans do). Is it perfect? No. Will it ever be so? No. Anyone who thinks that nothing less than this will make for an acceptable autonomous vehicle is a cretin. It only has to be a bit better than the (frankly, pretty awful) average human driver to start reducing the 1.2 *million* deaths that occur on the roads of our planet *EVERY YEAR*. I am pretty disappointed at 60Mins' narrow-minded, necessarily hysterical and very misleading take on this incredibly important technology and I suspect the attitudes and opinion of Tara Brown and Dr Cummings will not age well.
youtube AI Harm Incident 2025-11-04T22:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgwEsJDUid0UEZfG5q14AaABAg", "responsibility": "none",    "reasoning": "unclear",          "policy": "unclear",       "emotion": "indifference"},
  {"id": "ytc_UgyhLhyvJv0pvbG_1ch4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",        "policy": "unclear",       "emotion": "mixed"},
  {"id": "ytc_UgxXrs86U4Od9huckHV4AaABAg", "responsibility": "user",    "reasoning": "deontological",    "policy": "liability",     "emotion": "outrage"},
  {"id": "ytc_Ugy3SeLEcZq3fYJY5914AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_UgwG7lLPDd8pjU4odBx4AaABAg", "responsibility": "unclear", "reasoning": "unclear",          "policy": "unclear",       "emotion": "mixed"},
  {"id": "ytc_UgzccjQoEiJHLn-pTj54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgyUiIvVgcgyDk7nUp14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",  "policy": "unclear",       "emotion": "mixed"},
  {"id": "ytc_UgwPcKFBfxoHtFktAtd4AaABAg", "responsibility": "company", "reasoning": "unclear",          "policy": "unclear",       "emotion": "approval"},
  {"id": "ytc_UgzdkFwAPdnEEajdAZZ4AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_Ugwv7C45-tWaBNvBkR94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "outrage"}
]
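Since the raw response is a JSON array keyed by comment `id`, looking up the coding for a specific comment is a one-liner once the array is parsed. Below is a minimal sketch, assuming the response is valid JSON in exactly the shape shown above (the two entries are copied from the array; the lookup-by-id helper is an illustration, not part of the tool):

```python
import json

# Excerpt of the raw LLM response above (two of the ten entries, copied verbatim).
raw = '''[
  {"id":"ytc_UgwEsJDUid0UEZfG5q14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzdkFwAPdnEEajdAZZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]'''

# Index the codings by comment id for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Fetch the coding for the comment shown above.
coded = codings["ytc_UgzdkFwAPdnEEajdAZZ4AaABAg"]
print(coded["emotion"])        # outrage
print(coded["reasoning"])      # consequentialist
```

This matches the Coding Result table above: the entry for `ytc_UgzdkFwAPdnEEajdAZZ4AaABAg` carries the same dimension values (responsibility: none, reasoning: consequentialist, policy: none, emotion: outrage) that the tool renders for this comment.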