Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

| Comment preview | Comment ID |
|---|---|
| How can people not spot that the lip movements in these deep fakes are really ba… | ytc_UgzenKX72… |
| Tested your theory that AI couldn't make anything new by opening ChatGPT for one… | ytc_UgzZHt9Dj… |
| Call me a hater cause i am. I despise the argument that ai is more accessible an… | ytc_UgxpSafk9… |
| I dont think ai is truly a horrible thing, i just think its misused and needs re… | ytc_UgxMDrkNI… |
| So the BIG problem - we as humans have - is the old usual. Money and Power. Wond… | ytc_UgyyvN5QA… |
| Artists: Hehehe your job is getting stolen by a machine, I'm so glad I'm safe fr… | ytc_UgwIDw7YW… |
| 😂 Yes, it least part of it ⁉️ 😂 ☠️💰 GREED is WAR and is widely CONSIDERED a SIN … | ytc_Ugz9UFXJc… |
| There are many downsides but do not forget the big one that will affect everyone… | ytc_UgzwM5yNe… |
Comment
A Shocking Display of Misinformation:!!!! The WSJ's Takedown of Tesla's Autopilot :)))#########
Now for the truth!:::)
The Wall Street Journal's attempt to dissect Tesla's Autopilot through "The Hidden Autopilot Data That Reveals Why Teslas Crash" is nothing short of a sensationalist disaster, masquerading as journalism. Here's why this piece is not just a miss but a deliberate misstep in the narrative around autonomous driving technology:
Skewed Data Analysis: This video, which smacks of agenda-driven reporting, cherry-picks data to paint Tesla's Autopilot in the worst possible light. Instead of a balanced examination, it focuses on the most dramatic crash scenarios without considering the broader context of millions of miles driven safely under Autopilot. It’s like reporting on airplane safety by only showing the crashes and completely ignoring the millions of successful flights. The selective use of data here is not just misleading; it's manipulative.
Lack of Technical Nuance: The WSJ seems to have forgotten that Autopilot is a Level 2 autonomous system, which by design requires human oversight. The narrative pushed by the video suggests a failure in the technology itself, ignoring the human element in these accidents. This critical oversight screams of either ignorance or intentional misinformation. If you're going to critique a technology, at least understand its basic operational parameters.
Biased Expert Selection: The experts chosen for this piece are predominantly those with a history of skepticism towards Tesla's approach to autonomy. There's an absence of voices from within the industry who might offer a more balanced perspective on Tesla's vision and the actual progress in autonomous technology. This one-sided narrative does a disservice to viewers looking for an informed discussion rather than a hit piece.
Fear-Mongering over Education: Rather than educating the public on how to use Autopilot correctly or discussing the technological advancements and safety improvements, this video opts to scare viewers away from considering electric vehicles as a viable future. It's a portrayal that seems more interested in clicks than in fostering understanding or progress. The lack of contextualization about how other ADAS (Advanced Driver-Assistance Systems) compare in similar scenarios is glaringly absent.
Overdramatic Presentation: The style of the video, with its ominous music and dramatic reenactments, borders on tabloid journalism. It’s clear this isn't about informing but about scaring. This is journalism at its lowest, where the goal seems to be to stir panic rather than provoke thoughtful discourse.
Conclusion: In an era where technology like Tesla's Autopilot is pushing the boundaries of what's possible in automotive safety, this WSJ piece opts for the lowest road—sensationalism over substance. It not only fails to provide a comprehensive view but actively misleads by ignoring the significant safety enhancements and the potential of autonomous driving. This isn't just bad journalism; it's irresponsible journalism that could hinder technological advancement under the guise of public safety.
If you're looking for an honest, in-depth analysis of Tesla's Autopilot, look elsewhere. This video is a hatchet job, plain and simple, wrapped in the guise of investigative journalism. Avoid at all costs unless you're in the mood for a dramatized, fear-inducing narrative devoid of real insight.!!!!
youtube · AI Harm Incident · 2024-12-25T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgxeDMh2veO0vdgddUB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzEi-wkx72pocjaUxp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzFGh9XO_QPpfNx9VZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwQ1uX8HN4SCj7eXlx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwOCyVn3W8gLmDCdq54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxQqSkrC0eoZkQ6cm14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxPrFXOO5UiHefi5Sl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw95l8Dg7vzVRzEVhV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugxygs6IEKWqPm4wXhF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyruD0p4_jyzD7ih2l4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"mixed"}]
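The coding-result table above reports every dimension as "unclear" even though the raw batch response carries concrete codes for other comment IDs, which suggests a lookup-with-fallback step between the two. A minimal sketch of that step, assuming hypothetical helper names (`lookup_codes`, `DIMENSIONS`) that are not taken from the actual pipeline:

```python
import json

# Abbreviated stand-in for a raw batch response like the one shown above.
RAW_RESPONSE = """[
 {"id":"ytc_UgxeDMh2veO0vdgddUB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]"""

# The four coded dimensions from the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_codes(raw: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment, defaulting to 'unclear'."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        # Malformed model output (e.g. a stray ')' instead of ']'):
        # treat the whole batch as uncodable.
        records = []
    by_id = {r.get("id"): r for r in records if isinstance(r, dict)}
    record = by_id.get(comment_id, {})
    # Any missing dimension, or a wholly absent ID, becomes "unclear".
    return {dim: record.get(dim, "unclear") for dim in DIMENSIONS}

print(lookup_codes(RAW_RESPONSE, "ytc_missing"))
# an unknown ID yields "unclear" for every dimension
```

Under this sketch, an ID absent from the batch (or a response that fails to parse) produces exactly the all-"unclear" row shown in the table, while a matched ID returns the model's codes.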