Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
IF the autopilot fully disengages when you do an action like speed up and etc. Then i wouldnt blame the auto pilot at all. If it DOESNT disengage and SHOULD work normally just speed up then yes there is 100% fault on the auto. Question is Would he have crashed if he didnt do any input and let it on autopilot in this exact same situation? I would love to see a test without a person in the car obviously and the exact same situation EXCEPT the speeding up if it crashes Tesla is 100% is guilty if it doesnt crash its kinda his fault?? But I think the automatic system should always be ON at all times to break when it detects something. It should ALWAYS be on. even if autopilot is off. I will add I think you should always pay some attention to the road even on auto pilot I would NEVER trust it 100% no matter how its perfect because You can break and save yourself if something happens as long as you pay attention. But thats just me.
Source: youtube · Topic: AI Harm Incident · Posted: 2025-08-15T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugw3SaUrZX2yOzQ_6Oh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxVHhzoJA5ZGTRNuXB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgysDxWZFwMGJvuStPt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwVm3rQ3D6BcwQpuwx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyfVw9u8F179VQgM8J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxv1KqFmv4nmdmGM5F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxVNOUcSMXW2HAejnZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyMPT61al517ReEEip4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxQ4DmOOAHfyLa3tvR4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyK1M9lzunLIii4XUd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"regulate","emotion":"fear"}
]
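A downstream consumer has to parse this raw response and reject malformed records before the codings reach the database. The sketch below shows one way to do that in Python, assuming the allowed label sets inferred from the sample above (the real schema may include additional values, and the function name `parse_codings` is illustrative, not part of the tool):

```python
import json

# Allowed values per coding dimension, inferred from the sample response above.
# Assumption: the production schema may define labels not seen in this batch.
ALLOWED = {
    "responsibility": {"none", "user", "company", "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records.

    Malformed entries are skipped rather than failing the whole batch,
    so one bad record does not discard the other codings.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # missing comment ID: nothing to attach the coding to
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
print(len(parse_codings(raw)))  # → 1
```

Filtering rather than raising keeps a partially garbled LLM response usable; records dropped here can be re-queued for recoding by comment ID.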