Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "In this discussion it sounds as though you are making the AI the problem - direc…" (ytc_Ugz8CQcIg…)
- "If I want to use AI to make a profile Picture I can, it doesn't effect any one…" (ytc_Ugxgh7eju…)
- "Ai can be responsible if it can be taught about life and death and how we accept…" (ytc_Ugz-6JCmW…)
- "Id understand if the same thought caused pain but its random. You're telling me …" (ytr_UgwwHitZF…)
- "He's clean wearing nice clothes...If you wanna see some crazy filthy drug addict…" (ytc_Ugwgw-kHS…)
- "A lot of people seem to still be under the impression that an “AI” is an actual …" (ytc_UgxG3349M…)
- "*Why does an American need 3 robots? Answer: because a robot alone doesn't have …" (ytc_UgyvvcI9X…)
- "I just couldn't understand the appeal of AI, the reason why I love real art crea…" (ytc_Ugwk8vfnM…)
Comment
If this guy was driving a ford and showed up to court and tried to sue ford because the cruise control on his ford Explorer didn’t stop for a stop sign they would laugh him out of the courtroom but because it’s Tesla it’s a win. If anyone is looking at this case other than the way I just stated it you’re dead wrong, Tesla autopilot is equivalent to cruise control in other vehicles. If you don’t expect your cruise control to stop for stop signs then why should you expect Tesla? Because it has the word “ auto” in its name? cruise control has the word “ control” but it doesn’t “ control your vehicle buts that is the precedent the plaintiffs is setting . If this case wins, then someone can successfully sue ford for running into the back of a car while on cruise control, and then sue and say since the driving feature had “ control” in its name I thought it would take control and avoid an accident, ridiculous
Source: youtube
Video: AI Harm Incident
Posted: 2025-10-20T00:5…
♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxVSRevNfpEsXy-Kah4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgygwhEpahYKg8OZLLB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyCtcsEgGjHLJT5PpF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxpfcXZ1cVY8T8WN7V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxaEv9AjrNXYNE10NZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyj5zO0aDUmpAxCdEZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxZVLm5WJguL8LuB9x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxiM8DVfMbWsCvF79h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvFkngF8l8gB30U4p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw6gOyk_7dZLjcExOF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
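A minimal sketch of how a batch response like the one above could be parsed and validated before the coded dimensions are stored. The allowed label sets are inferred only from the values visible in this page (the real codebook may define more), and the function name `parse_coding_response` is hypothetical, not part of any shown pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the samples above
# (assumption — the actual codebook may contain additional labels).
SCHEMA = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "approval", "fear", "indifference",
                "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only rows whose labels
    all fall within the expected value sets."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid
```

Rejecting rather than silently storing out-of-schema labels makes hallucinated or malformed model output visible at coding time instead of at analysis time.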