Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I geniuenly dont understand why AI that makss images isnt banned yet. Text gener…" (ytc_UgywCo8jt…)
- "ChatGPT Plus What microphone do you use, or how do you compress your audio to g…" (ytc_UgzcO4_Er…)
- "Regarding competition, if Tesla gets versatile enough for everywhere and with a …" (ytc_Ugz3zQsgy…)
- "Welcome to the matrix 2.0? WW3 where Man Vs. Man with Ai alternated info, is co…" (ytc_Ugwv565d8…)
- "Large language models are not what I would call real AI. It is just a predictive…" (ytc_Ugy6ZWikF…)
- "Don't forget, these are the same people who want you to use paper straws, eat me…" (rdc_lp6nvwj)
- "OpenAI CEO Sam Altman revealed that showing good manners to a ChatGPT model — su…" (ytc_UgzN97_WO…)
- "I love how such smart people don't acknowledge the existence of movies like Term…" (ytc_UgydmFg1s…)
Comment
woman crossing in dark with no reflectors, not smart ... however, this is the PERFECT example of why Self Driving cars are supposed to be SAFER. than humans.. any LIDAR system could see this pedestrian in plenty of time to stop or avoid, the implementation of LIDAR on this vehicle FAILED miserably ... unfortunately the safety driver who was supposed to be the fall back ... was not doing their job (she would have had much better visibility to the pedestrian than the grainy video we are viewing .. also, one must ask ... we was this video released publically?).
youtube · AI Harm Incident · 2018-03-24T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxy01VX_8QwXy9_57V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzP2TOIEQNrkwHotDp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwjt5cv4iPRLp6BKrF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyUUkGKDz2ID-KRa1F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzSzbckQWJHdkyLy0F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwkBaLtQi5J43dOVjF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugz0hN4KTF0haVONTDJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwvRS-7NO2QZfGHuJ94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz2x7umeeGIBkOO7Sp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwQLYrOrlprqGY7tD94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
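The raw response is a JSON array of per-comment codings, so a lookup-by-ID view (as offered above) can be built by indexing the array on the `id` field. A minimal sketch, assuming the model returns exactly this array shape; `raw_response` here is a hypothetical two-entry excerpt, not the full batch:

```python
import json

# Hypothetical excerpt of a raw batch response, mirroring the array shown
# above (same keys: id, responsibility, reasoning, policy, emotion).
raw_response = """
[
 {"id":"ytc_UgzP2TOIEQNrkwHotDp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugxy01VX_8QwXy9_57V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and index each coding row by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_UgzP2TOIEQNrkwHotDp4AaABAg"]["emotion"])  # → outrage
```

Indexing once and reusing the dict keeps ID lookups O(1), which matters when inspecting many comments against a large batch response.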