Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgyN5u9NO…`: "Ai will solve gloal warming. Rich owners will check the math and quickly get rid…"
- `rdc_eh6l7np`: "The City of Austin has a [PDF of Native and Adapted Plants](https://www.austinte…"
- `ytc_Ugzrbm7N8…`: "Born in the 2000's, we were told to commit to our passions and get a job out of …"
- `ytc_Ugy2P8PC8…`: "I want one so I can roll play on a rainy night, wearing a trench coat. Ill hav…"
- `ytc_UgxIeip1g…`: "I guess men now can sexually assault their AI robots. Sounds like a great inven…"
- `ytc_UgxFN82BO…`: "But it sucks, it with the mass amount of resources is trash. It is more than li…"
- `ytr_Uggozw99v…`: "+Jim while yes that is true the effects would be similar to the umbrella effect …"
- `ytc_Ugx3GhKot…`: "Just imagine, the simplest explanation of how AI works took dam near a half hour…"
Comment
I know I've come to this video late, but I wonder if another potential way of breaking the tail light assumption would be to have one of those helmet-mounted rear lights like the Brake Free. Would a third light be enough to convince the AI that it isn't a far off car? Or if you're a short enough rider or in a low enough riding position, would it just continue to think the same, assuming that the third light is just the LED brake light that exists at the top of most cars rear windows? It might at least make it think that the far off car is braking, but if it thinks that it's far enough off, that might not actually change the AI's decision making any.
youtube
AI Harm Incident
2025-08-20T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxwwikthp6FhSlvY3h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzv23JEgPCM8GG40zF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7uwgSTAYN3gGihet4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwhvBRCjU6kRnPWjpt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwRdvXW334LWfme2kl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwNwm-sIm5BdnnZgqt4AaABAg","responsibility":"company","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy1JEbwJbC7k4Ct8pV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmUbIEEAhmOmdLFbF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyrDmyEThIMTXFgrxd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzfChESkctwF8zzrLZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
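Each row of the raw response is one coded comment with four dimensions. A minimal sketch of how such a batch might be parsed and validated, assuming the allowed label sets are exactly those seen in this sample (the full codebook may define more values); `parse_coding_response` and `ALLOWED` are hypothetical names, not part of the tool:

```python
import json

# Label sets per dimension, inferred from the sample response above.
# This is an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban"},
    "emotion": {"approval", "indifference", "outrage", "fear", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row must carry a comment ID and a known label per dimension.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(row)
    return valid

# Example with one valid row and one row carrying an unknown label.
raw = (
    '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"ytc_y","responsibility":"bogus","reasoning":"mixed",'
    '"policy":"none","emotion":"approval"}]'
)
kept = parse_coding_response(raw)
```

Dropping malformed rows rather than raising keeps a single bad LLM output from discarding the whole batch; the rejected IDs could instead be queued for re-coding.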