Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by ID, or pick one of the random samples below; a sketch of the lookup appears after the sample list.
- Musk doesn't have a moral compass? Well... I guess he's absolutely right about… (ytc_UgyEdeqTy…)
- Your counter arguments are so on point I feel like I'm being convinced even thou… (ytc_UgxcBeaW1…)
- In my experience, I'm seeing ChatGPT and Copilot tools offer amazing boost to jr… (ytc_UgyBJRSiw…)
- I think many ppl jumping on this trend don't actually mean to humanize robots. I… (ytc_UgyO_GLK-…)
- I hate the term "AI Art". I hate it both when it is used by people who call them… (ytc_Ugw1n6qvI…)
- God gave us a brain so we can think for ourselves. So why do we create a machine… (ytc_Ugxt0TuoT…)
- I don't even have that much experience post graduation but that is what it looks… (rdc_kuoy7oi)
- I remember the time when we thought self driving cars would never exist. Now we … (ytc_UgxeO8eCI…)
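Selecting a sample (or entering an ID) retrieves that comment's stored coding record. A minimal sketch of such a lookup, assuming the records are kept as one JSON object per line; the file name coded_comments.jsonl and the storage layout are illustrative assumptions, not the tool's actual backend:

```python
import json

def lookup_comment(comment_id: str, path: str = "coded_comments.jsonl"):
    """Return the coded record for one comment ID, or None if absent.

    Assumes one JSON object per line with an "id" field; the file name
    and layout are assumptions for illustration only.
    """
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None
```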
Comment
> So, the driver floored the accelerator within 30 seconds of the crash, he wasn't looking at the road, he was rummaging around on the floor, and he ignored the warnings provided by the vehicle when you activate the automatic steering on your profile (and any time you take your hands off the wheel) saying that auto-steering must always be supervised by the driver, and the driver must be prepared to take over at any time. We have a Tesla and these warnings are clear; you must pay attention. These aren't just in the manual, they are displayed on the screen before you are allowed to activate the automatic driving mode. I find the argument that there wasn't sufficient warning strange given our experience.

youtube · AI Harm Incident · 2025-08-15T19:1… · ♥ 12
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
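Each record carries the four coded dimensions plus a coding timestamp. A minimal sketch of the record shape, assuming a Python pipeline; the class name CodingResult and the label vocabularies below are inferred from the values visible on this page, not taken from the tool's source, so the real label sets may be larger:

```python
from dataclasses import dataclass
from datetime import datetime

# Label vocabularies observed in this section (assumed, possibly partial).
RESPONSIBILITY = {"user", "company", "distributed", "none"}
REASONING = {"deontological", "consequentialist", "mixed", "unclear"}
POLICY = {"none", "liability", "regulate"}
EMOTION = {"indifference", "outrage", "resignation", "mixed"}

@dataclass
class CodingResult:
    """One coded comment, mirroring the table above."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime
```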
Raw LLM Response
[
{"id":"ytc_Ugz05N2k2HstAfTObnx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxiqWl2KCLwiQ-LuHx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy-vVh8EgGFGv9hPXp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBgr5InhwMYeiq5CF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyTq7JuerURsRbTaup4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzxC_zUpmJISuupmWd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzMGVyHLrEuoXa4Ffl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz8fhtkzxtBvuNqo0x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz1OXVGATHuqOwrerV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgysO5ZhCSRvNMyFEuF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
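The model codes comments in batches and returns one JSON array per batch, so downstream code has to parse and sanity-check the raw response before using it. A minimal validation sketch; the function name parse_llm_response and the skip-malformed-entries policy are assumptions, not the pipeline's documented behavior:

```python
import json

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw model response into coded records.

    Drops entries that are not objects or are missing required keys,
    so one malformed element does not discard the whole batch.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    return [r for r in records
            if isinstance(r, dict) and REQUIRED_KEYS <= r.keys()]
```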