Raw LLM Responses
Inspect the exact model output for any coded comment: look one up by its comment ID, or pick from the random samples below.
- "Then, what is Artificial intelligence? This is not absolutely right answer, howe…" (ytc_UgxOTQifu…)
- "this is all scam altman's fault. they've anthropomorphosized ai to a level of m…" (ytc_UgxdESwER…)
- "Enough with the fearmongering. I’ve used ChatGPT nearly every single day for a …" (ytc_UgyRqN2Zc…)
- "@TheFreshSpam An interesting way to argue - you repeat my decision, only for som…" (ytr_UgxajNJzt…)
- "Step 1: Create a super-powerful benevolent/neutral AI before a super-powerful m…" (ytc_UgzCdwe6C…)
- "When I say to people my productivity has increased by 10x, sometimes 100x, they …" (ytc_UgxQh9Pd-…)
- "Ai might not be able to create a different original like that but if trained on …" (ytc_UgzWpuBbD…)
- "As a fellow Brit, I’m sad to say I fucking hate our country. Currently looking t…" (rdc_fwhmw7z)
Comment
My two reactions:
1) I use FSD like I use any other tool, extremely carefully and with a constant thought of "how could this go wrong next?"
2) Motorcycle deaths per year fluctuate between 3,000 and 5,000 since the 1970's, with 50,000 to 100,000 injuries per year. I think it is an important question to ask, without discounting the tragedy of these two deaths, why are so many focused on these two? Because the car was doing the driving?
What about the unseen hundreds of thousands or millions of accidents that didn't happen because of FSD?
With or without FSD (or even a future, better AI system), the owner/driver of the car MUST be responsible for the accident, just as in any other case. A car is a tool - the wielder must be held responsible and not the toolmaker.
youtube · AI Harm Incident · 2022-09-26T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzVJfG-AVRs_IjtUR14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"concern"},
  {"id":"ytc_UgyFcFo17Iz77RZ7kxN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyuHHOYbje74v-LEQ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy44Uu_6CM87kC2D9B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxOMp39mj7AX6QaKVR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxDwB-gzJ2lsZ4jLJJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxkLA7IekFRrfXui2Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxigWtmNHS395_MJaN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyQESbNb8A-IGnM0zx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyD4joqCj_i47H5kvV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
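The coding-result table above appears to be a rendered view of one entry in this array: the distributed / consequentialist / none / resignation values match the entry with ID ytc_UgxkLA7IekFRrfXui2Z4AaABAg. Below is a minimal sketch of that lookup, assuming the raw batch response has been saved to a JSON file; the file name, function names, and the fixed dimension list are illustrative assumptions, not the pipeline's actual code.

```python
import json

# Illustrative sketch only: the file name, function names, and the
# dimension list are assumptions inferred from the page, not pipeline code.

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_response: str) -> dict:
    """Index a raw batch response (a JSON array of codings) by comment ID."""
    return {row["id"]: row for row in json.loads(raw_response)}

def show_coding(codings: dict, comment_id: str) -> None:
    """Print one comment's coded dimensions, mirroring the table above."""
    row = codings.get(comment_id)
    if row is None:
        print(f"no coding found for {comment_id}")
        return
    for dim in DIMENSIONS:
        print(f"{dim:>14}: {row.get(dim, 'unclear')}")

if __name__ == "__main__":
    with open("raw_llm_response.json") as f:  # hypothetical file name
        codings = index_codings(f.read())
    show_coding(codings, "ytc_UgxkLA7IekFRrfXui2Z4AaABAg")
```

One incidental benefit of indexing by ID: duplicate IDs collapse in the dict, so comparing len(codings) against the batch size is a quick check for comments the model dropped or coded twice.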