Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugz4RiBRM… — "Problem is we dont have super intelligence in our ai , we dont even have intelli…"
- ytr_Ugxqmr8T2… — "Those are very old quotes. Elon in particular has backed considerably off from t…"
- ytc_UgzaqqvqQ… — "In my opinion (don’t take this as me vouching for prompters I don’t support what…"
- rdc_et82b5s — "Maybe if the rich didn’t lobby for coal and against clean energy we wouldn’t hav…"
- ytc_UgwyWYKqa… — "Almost nobody in the industry does boilerplate code anymore anyways, IDEs genera…"
- ytc_Ugy-8OClz… — "Just bc ai makes things easier for some people dosent mean it makes things easie…"
- ytc_UgwLwRMUZ… — "The logical conclusion of the race to automate everything is either the world of…"
- rdc_mrv267f — "There is something absolutely fishy about this, i am observing from past 2 weeks…"
Comment
For those who are not truly into Teslas, this article may look incredibly serious, but it only partially is. See, nobody in the video talked about Full Self-Driving versions 12 and 13, which haven’t cause a single accident before almost 6 billion miles were driven. What was shown here was FSD in some frames (turned on by the former employee) but then they didn’t show the actual performance of the system. And that’s because the video revolves around basic Autopilot software, which is obsolete for over 5 years now, showing a lot of limits in recognizing unusual scenarios: the crashes shown during the video by the tesla dashcam are an example.

Also, the owner’s manual is like 295 pages, and half of them are warnings telling the owner to always pay attention and to be ready to disengage the system at all times. If you don’t listen to these warnings, even if they officially exist, how can you blame the company at the end of the day? There’s not a single car company that provides a system (FSD, not Autopilot) capable of performing a complete driving task with zero interventions in most of the cases. It is still supervised, and it’s even said on the display when you turn it on for the first time. It’s meant to reduce the driver’s workload, not to replace them, for now.

When the Cyberacab will start to operate on public roads, running FSD Unsupervised, and it will cause an accident, then all the questions raised in this and other videos will likely begin to have sense. For now, FSD needs to be treated like if you were a beta tester, even though it has reached incredibly high safety scores, in fact it’s available to all Tesla owners in NA as an optional, and not only to the early access program drivers.
Source: youtube · Topic: AI Harm Incident · Posted: 2024-12-14T08:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgykhBnK03A1cnzoMK14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_0-FCKcGUog-xMyR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyB0K3BpkDGlIv0-0d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyfqxdC6amIdtgpkxN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugzr2jhU0SC7IdM0fVB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx1VuB22UrMVB9ECqh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgydaUlBFkjLkGIJRpR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNsYdmvLWHKfavuTx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzoM0nD_LE-irFS2tx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxT0dy2tMWQ7ZEFb9N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
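The raw model output is a JSON array with one object per coded comment, keyed by comment ID, with the four coding dimensions as fields. A minimal sketch of how such a response can be parsed and indexed for lookup by comment ID (the variable names and the shortened two-row sample below are illustrative assumptions, not part of any published pipeline):

```python
import json

# Assumed shape of a raw LLM response: a JSON array of per-comment codes
# along four dimensions (responsibility, reasoning, policy, emotion).
# IDs here are shortened placeholders for illustration.
raw_response = """[
  {"id": "ytc_example1", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none",
   "emotion": "indifference"},
  {"id": "ytc_example2", "responsibility": "none",
   "reasoning": "mixed", "policy": "none",
   "emotion": "approval"}
]"""

# Index the codes by comment ID so any coded comment can be looked up
# in constant time.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_example2"]
print(code["responsibility"], code["reasoning"], code["emotion"])
# → none mixed approval
```

In practice the parse step would also want to validate that each object carries all four dimension keys and that the values fall in the expected code sets, since raw LLM output is not guaranteed to be well-formed.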