Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples

- "Yes, I understand that but people use this dilemma as a reason why we should not…" (ytr_UgzhU9chL…)
- "Is this not also the same person who, not even a couple years ago was telling us…" (ytc_Ugz6boeZx…)
- "Stupid AI engineers don't understand that if they will be undistinguisable than …" (ytc_UgxE8M4SB…)
- "It's the fault of big companies wanting to use AI to save money and time and cop…" (ytc_Ugx4xGdbg…)
- "I usually agree with Mr deGrasse, but I cannot do so with the indifferent attitu…" (ytc_Ugw1vkD0c…)
- "I understand u don't actually believe its conscious just trying to trick it but …" (ytc_Ugyq6_xQH…)
- "Isaac Asimov's Three Laws of Robotics, introduced in his 1942 short story "Runar…" (ytc_UgzIi_DRn…)
- "I wonder how long until people really that the A in A.I. means artificial or sim…" (ytc_UgxpSuLLm…)
Comment
This was a perfect storm of a complete failure by all 3 ... the pedestrian was walking right in front of a car at night, that the professional "safety driver" was not paying attention at the exact same time and so illegally relying 100% on the self driving technology which obviously still isnt capable to avoid all collisions, otherwise they would not have been out there even doing this testing... maybe next time they try out this test they should send the Uber CEO out in the dark to stand directly in front of the 40 mph self driving car, instead of trying to perfect it with ordinary citizen jaywalkers used as test dummies in real world conditions ? i think in time they can actually probably make it work by using their lidar and radar etc. because that can sense better in the dark and around corners and obstacles better than any person could with their eyes, but if the consequences in a failure are a bit higher for themselves that can help them to work harder on it too...
Source: youtube · AI Harm Incident · 2019-11-13T06:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzIZWKU_29pIsbtnqZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwwv6PTvh9Toy8lGpR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwpFxWKuXLG7WTx4f14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxeYsPHofFY5shuHpV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyqChGPq8PNh3BxWVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw8uPE6BqQ9SuYoPxh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw08XGO72WtieHOMTl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxELwttJKt8QfPXgOt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyG776EsfrVrjb8qex4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz1iM-qSU0GQXJxPD14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
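A raw response like the one above has to be parsed and validated before its codings can populate the per-comment table. The sketch below shows one way that could look in Python. It is a minimal illustration, not the tool's actual code: `parse_batch` is a hypothetical helper, and the `SCHEMA` value sets are inferred only from the labels visible on this page (the real codebook may allow more).

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# values shown on this dashboard; the actual codebook may differ.
SCHEMA = {
    "responsibility": {"ai_itself", "user", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "ban", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "fear", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of codings) into a
    {comment_id: coding} map, dropping entries with missing IDs or
    out-of-schema values instead of letting them corrupt the table."""
    coded = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        dims = {k: entry.get(k) for k in SCHEMA}
        if cid and all(dims[k] in SCHEMA[k] for k in SCHEMA):
            coded[cid] = dims
    return coded

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
print(parse_batch(raw)["ytc_example"]["policy"])  # → regulate
```

Filtering invalid entries here, rather than at display time, keeps the "Coding Result" table from ever showing a value outside the codebook even when the model drifts off-schema.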