Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
If you want to drive...drive, if you don't? Hire a cab....this AI crap is deadly…
ytc_Ugw5yObIX…
i agree that ai art should stay in its corner and never replace artists in any w…
ytc_Ugy4S4l3H…
"Be a plumber"
Little scoff. Real nice. Cant wait to see you lose your job…
ytc_UgyqIxdcn…
They are emptying old stock to justify production of new stock and design of new…
rdc_mcqteu2
Artificial Intelligence will work as long as nuclear power countries not decide …
ytc_UgxjUC4NE…
It is unnerving. I don’t think they should have human features. I believe we sho…
ytc_UgytZXL8Z…
And he is right. AI is just the next thing added to the list that they say is go…
rdc_mt7uebt
Reason? Insourcing. Companies like Microsoft firing 10,000 Americans and fly in …
ytc_UgzkAzKno…
Comment
Not going on the side of Tesla, as the car did register the stop and the car, so it should have stopped. But there is also a reason US law, in generally all states, says you must not be negligent while driving: keep your eyes on the road, hands on the steering wheel, etc. Few states allow fully autonomous vehicles, and yes, Florida is one of them. So my guess is the court has to decide whether Tesla's self-driving feature is a driving assist or an autonomous vehicle. If the court decides it is an assist, the driver is responsible; if an autonomous vehicle, then the company is liable. But this also gets more complicated, as special insurance is required for autonomous vehicles, which would mean the vehicle was operating with no insurance, and I couldn't say who's at fault, or where it ends.
youtube
AI Harm Incident
2025-09-12T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx2Mv8TukwB_VeXp5F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwh4Fu7UDRAU7pAi3F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugz1l6tgW8rZ2O6IvNR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxgJ8QZX3PfSZD4tah4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw7G17Z3BZ8dJT_syl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwRWiRPSeOxdyV-BY94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwflz_W5JsX-kJlGa94AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugwy8FJEbVVI2EFEf594AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzvndmsEWPB0d9uFaF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgzxH7EpIvK13GD55Jl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
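Each row in the raw response carries the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). As a minimal sketch of how such a batch could be parsed and sanity-checked — note the allowed category sets below are inferred only from the values visible in this batch, not from the actual codebook — the response can be validated like this:

```python
import json

# Category sets observed in this batch (assumed; the real codebook
# may define additional categories).
ALLOWED = {
    "responsibility": {"user", "company", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM response and keep only rows whose values
    fall inside the allowed category sets."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Hypothetical single-row batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"resignation"}]')
print(len(validate_batch(raw)))  # 1
```

Rows with out-of-schema values (a model hallucinating a new category, for instance) are silently dropped here; a production coder would more likely log them for re-prompting.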