Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- You just burnt your brand for kid friendly fun entertainment. Ad for a doomsday … (ytc_UgxD1MFK4…)
- All the ai bros are crying. And it makes me happy. I can dunk harder on these ai… (ytc_UgwkbQmgo…)
- Good spin! "oh yeah we're totally not hiring because of advanced AI not because … (rdc_m2a8us4)
- This is hyperbole, the threat isn't AI. The threat is authoritarian religious fo… (ytc_UgwufUzn0…)
- China is odd. I'm an optician and today a guy came in wanting to buy contacts wi… (rdc_dv6ttby)
- Elon Musk said he BEGGED all the proper authorities to rethink all the dependanc… (ytc_UgxSUDXxP…)
- Why would you program an expensive robot to clean toilets when you can pay a des… (ytc_UgyVUGF-0…)
- >Real AI will behave like Futurama's Bender. "bite my generated image of a s… (rdc_ks20x5n)
Comment
In my opinion it is clearly the fault of the driver. The bike and the woman can't be seen in the dark, but ceep in mind that cameras often see less than the human eye at night (Same street with another camera: https://youtu.be/CRW0q8i3u6E ). And even IF the sight were as bad as in the video, you are not allowed to drive faster than the point "stopping distance (with reaction time) = distance of sight". This applys also to automatic cars, because otherwise their driver can't react if there is an failure. Which was here the case. Driver distracted, Lidar and sensor failure (car didn't brake at all, Lidar systems usually working better at night). Uber develops their cars on the street.... Other companys are using slow cars (Google) or test courses not without reason.
youtube
AI Harm Incident
2018-03-23T23:5…
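The commenter's rule of thumb — never drive faster than the speed at which stopping distance (including reaction time) equals sight distance — can be made concrete with standard road-safety figures. The ~7 m/s² braking deceleration and 1 s reaction time below are textbook assumptions, not values taken from the comment:

```python
def stopping_distance(speed_kmh, reaction_s=1.0, decel=7.0):
    """Total stopping distance in metres: reaction distance plus braking distance.

    Assumes a constant braking deceleration (~7 m/s^2 on dry asphalt).
    """
    v = speed_kmh / 3.6                  # km/h -> m/s
    reaction_dist = v * reaction_s       # distance covered before braking starts
    braking_dist = v * v / (2 * decel)   # v^2 / (2a)
    return reaction_dist + braking_dist

# With headlights revealing only ~40 m of road, ~63 km/h is already the limit:
print(round(stopping_distance(63), 1))  # just under 40 m
print(round(stopping_distance(64), 1))  # over 40 m
```

Under these assumptions, anything above roughly 63 km/h on a 40 m sight line violates the commenter's rule, for a human driver and an automated one alike.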
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzCQ44Cg1Md9zU1EhF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzpsJ__r0N7gbEHzNp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwlQI3p3kX7MPYT5Y54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx827I7qz11nS6-KKN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyQL2iyLyyfUKTYHuV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwOp0N0eJQbbJGK7FV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxDiYRj-r_H2-7ZEDh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyr7nzZGq9xk_Jlyr94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwoCxiJBTSmYhvxVNh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzPiLcv4IIOYGr7E4B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"})
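The raw response above opens a JSON array with `[` but closes it with `)`, so a strict `json.loads` call fails — which plausibly explains the all-`unclear` row in the Coding Result table. A minimal sketch of a tolerant parser for such responses (the bracket-repair heuristic and the allowed-value sets, inferred from the sample output plus `unclear`, are assumptions, not the tool's actual code or codebook):

```python
import json

# Dimensions coded per comment; value sets inferred from the sample
# response above plus "unclear" — not an exhaustive codebook.
DIMENSIONS = {
    "responsibility": {"company", "user", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "mixed", "approval", "unclear"},
}

def parse_raw_response(raw):
    """Parse a raw LLM coding response into {comment_id: codes}.

    Unknown or missing values fall back to "unclear" instead of raising,
    mirroring the "unclear" rows in the Coding Result table.
    """
    text = raw.strip()
    # Heuristic repair for a stray trailing ")" where "]" was expected.
    if text.endswith(")"):
        text = text[:-1] + "]"
    records = json.loads(text)

    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            continue
        coded[cid] = {
            dim: (rec.get(dim) if rec.get(dim) in allowed else "unclear")
            for dim, allowed in DIMENSIONS.items()
        }
    return coded

# Hypothetical one-record response ending in ")" like the one above:
sample = ('[{"id":"ytc_demo","responsibility":"company",'
          '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"})')
codes = parse_raw_response(sample)
```

Falling back to `unclear` per dimension, rather than discarding the whole batch on one malformed value, keeps a single model slip from wiping out ten otherwise usable codes.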