Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
To me, the fundamental problem is the terminology. "Full Self Driving" really means level 5, where a steering wheel and manual controls are unnecessary. Tesla is barely level 3.
Autopilot is a funny one. While the conventional view is that it can completely fly a plane on its own, that is not the case. Autopilot must be supervised by a human, just as Tesla's autopilot must be supervised. Unfortunately for Tesla, the common assumption in this case is what is relevant.
Given that the driver had been warned by Tesla's system and repeatedly chose to ignore it, a lot of the blame falls on the driver in spite of Tesla's shortcomings. Regardless of the manual, the driver knew he was supposed to pay attention and chose not to. That's on him.
youtube
AI Harm Incident
2025-08-16T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyTp4bS-FxEYBWa_2R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugy8lbz2-ZDkN6IZbG14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzPvbGYo29-rcR8b1p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyR5B9KXHgr2nBlMMB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxZgShccTacBLeLF3x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_Ugw5GL5gHFEFix5GnUR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxeZVWqD4x5xANjm8p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyBrGwuGA7xpoPWgBB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzoCWixuWHaQ8HgI9t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw6mxPBMtanB5Is5hJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
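A raw response like the one above is a JSON array of per-comment coding records. A minimal sketch of how such a response could be parsed and validated before use (the two sample records and the required-key set are taken from this page; the function name is hypothetical):

```python
import json

# Two records copied from the raw LLM response shown above.
raw = '''[
  {"id": "ytc_UgyTp4bS-FxEYBWa_2R4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgzPvbGYo29-rcR8b1p4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]'''

# The four coded dimensions plus the comment ID, as seen in the response format.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM response into coding records, dropping malformed entries."""
    records = []
    for entry in json.loads(text):
        # Keep only entries that carry every expected field.
        if isinstance(entry, dict) and REQUIRED_KEYS.issubset(entry):
            records.append(entry)
    return records

codings = parse_codings(raw)
print(len(codings))                      # 2
print(codings[0]["responsibility"])      # company
```

Skipping malformed entries rather than raising keeps a single bad record from discarding an entire batch; how strict to be (e.g. also checking values against a closed vocabulary for each dimension) depends on the coding pipeline.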