Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below.
- "This is what the back store to Dun is about man figuring out AI is not controlla…" (`ytc_Ugwxipv7C…`)
- "Hope you say that when you get replaced by AI, which will eventually happen to e…" (`rdc_lv8cgsc`)
- "This is the biggest unlock for Microsoft! Their investment in OpenAI enabled the…" (`rdc_oh6g1av`)
- "The question is not whether or not it's ethical to stop the development of A.I. …" (`ytc_Ugh1jhhjo…`)
- "Ai will likely realize we have something special in us and want to merge...cybor…" (`ytr_UgxX3iA24…`)
- "Why? What do robots/AI have to gain by conquering humanity and the world? Logica…" (`ytr_UgyvkU837…`)
- "... I have a feeling im going to be in the minority... but I think this is a goo…" (`ytc_UgzkHB_7b…`)
- "plot twist: the robot is only gonna destroy a European guy she met one time name…" (`ytc_Ugz_3shDc…`)
Comment
Why is 60Mins referring to incidents that occurred *6 years* (or even longer) ago and which involves an entirely different system (hardware and software) to what FSD *today* is using?
In reality, you have only to look at the record of FSD use in cybercabs operating on the streets of cities in the USA today to see that this technology 'works' (ie it causes less collisions that humans do). Is it perfect? No. Will it ever be so? No. Anyone who thinks that nothing less than this will make for an acceptable autonomous vehicle is a cretin. It only has to be a bit better than the (frankly, pretty awful) average human driver to start reducing the 1.2 *million* deaths that occur on the roads of our planet *EVERY YEAR*.
I am pretty disappointed at 60Mins' narrow-minded, necessarily hysterical and very misleading take on this incredibly important technology and I suspect the attitudes and opinion of Tara Brown and Dr Cummings will not age well.
youtube · AI Harm Incident · 2025-11-04T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwEsJDUid0UEZfG5q14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyhLhyvJv0pvbG_1ch4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxXrs86U4Od9huckHV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy3SeLEcZq3fYJY5914AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwG7lLPDd8pjU4odBx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzccjQoEiJHLn-pTj54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgyUiIvVgcgyDk7nUp14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwPcKFBfxoHtFktAtd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzdkFwAPdnEEajdAZZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwv7C45-tWaBNvBkR94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```