Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- How did you set this up? Is this a special version of ChatGPT? As far as I know,… (ytc_Ugxkb5caz…)
- i would very much like to use ai right now to simplify whatever this show was ab… (ytc_UgxioaXGz…)
- If your architect built a neural network that produces lies then there is a flaw… (ytc_UgzeXwT00…)
- "Using AI I can show the world I can also do it" but you only using AI because y… (ytc_Ugwq7S_XK…)
- There’s a movie called “ROBOTS” just released is about this….more in depth Cz th… (ytc_UgzTxWe3f…)
- Wanna rant about my mother using an AI app to generate images for her Instagram … (ytc_UgySZbHNS…)
- I believe AI would examine every input whereas humans might miss or not consider… (rdc_i2vhhec)
- \- Also when you read about the history of machine learning, it's clear that pro… (ytc_Ugz8UOpdn…)
Comment
My friend’s son was killed in January while driving a Tesla. I’m not sure if autopilot or self-driving were turned on at the moment of the crash. The accident was deemed the other driver’s fault for running a red light. However, my friend has confided in me several times that he wonders whether self driving was on and whether it had an opportunity to avoid the accident, but didn’t.
I’ve discussed consulting a lawyer about that, but my friend is very hesitant about that.
youtube · AI Harm Incident · 2025-08-15T19:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgwuRx9UpPhP587tdo14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgzSjj9Tp60Cr89I_tZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugy495fkc9ChMossIzB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyk1L8QXTLa0ZHUmjh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw6Jio8EXR8fpft5eR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwRgotJplF-O_rekRx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxwOlyGLFQVbRUGKN54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwY0Ozrgn3a-9CpDex4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx9UWRQflY_Lol5RJp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKsTeKHXeqYl2q-kd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}]
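A raw response like the one above has to be parsed and validated before the codes reach the dimension table. Here is a minimal sketch of that step, assuming the four dimensions shown in the Coding Result table and only the code values observed in this sample response; the actual codebook may define additional values, and the function name is hypothetical.

```python
import json

# Value sets observed in the sample response above; the real codebook
# may allow additional codes (this is an assumption for illustration).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed", "unclear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse one raw LLM coding response into {comment_id: codes},
    skipping rows with a missing id or out-of-vocabulary values."""
    rows = json.loads(raw)  # raises ValueError if the model emitted malformed JSON
    coded = {}
    for row in rows:
        cid = row.get("id")
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if cid and all(codes[d] in ALLOWED[d] for d in ALLOWED):
            coded[cid] = codes
    return coded

raw = ('[{"id":"ytc_abc","responsibility":"company","reasoning":"deontological",'
       '"policy":"regulate","emotion":"approval"}]')
print(parse_raw_response(raw))
```

Validating against a fixed vocabulary is what lets the inspector fall back to "unclear" in the dimension table rather than displaying an arbitrary string the model invented, and `json.loads` failing outright is exactly the case a trailing `)` instead of `]` would trigger.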