Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- If all cars were self-driving, it would save countless lives, and someday that w… (ytc_Ugxa6LHN3…)
- Not at all. 7:26 Alex said chatgpt said I apologise. When? Same with I’m excited… (ytr_UgxAhIlp3…)
- It is coming, folks. China wants to spread this disease all over the globe. They… (ytc_UgwuIaI9d…)
- The claim that AI is currently conscious is absurd. The odds of identifying a co… (ytc_UgzXEgLjC…)
- I USE A.I art generators, and I didn't understand why it's such a problem. Thank… (ytc_UgyplD0As…)
- Has anyone every thought that because they are asking those questions now the ro… (ytc_UgwghBCdK…)
- Artist don't give a fuck if you have ethical models that were trained on consent… (ytr_UgwiaMo60…)
- his excuse was so weird to lmao "b-but i was researching ai pn! a-and i accident… (ytc_UgzIU7wYD…)
Comment
You do not need a robot to drive to reduce collisions, injuries, fatalities; all you have to do is good road design, and train your drivers. Car companies, lobbying, and "car culture", seem to mean these are not investigated, people seem to accept it for whatever reason.
It is not simply due to having a large population, because the proportion of collisions are way higher than other countries.
Platform: youtube
Date: 2025-08-03T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwaIqEdqvGpgMnoRox4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxg8eoz6uWAY4Vts5x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzXoTRTs4NyUYIzD-V4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyGfKYmwlmUbKAurr94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz1EzsX52qGFhmzTj14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxItKSnKrWDTzRYiIV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy2t7wwi2WJY7mYVrt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwtvnlIix9XlX0D6nF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwsqp_APbqNCBmxuLB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzFHVCIlHdR22ck5Sd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
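
The raw response above is a JSON array of per-comment coding records. A minimal sketch of how such an array might be parsed and validated before indexing by comment ID; the `CODEBOOK` value sets below are inferred only from the values visible in this one sample, not from a published codebook, so the real schema may allow more categories.

```python
import json

# Allowed values per dimension, inferred from this sample's output
# (assumption: the actual codebook may define additional categories).
CODEBOOK = {
    "responsibility": {"none", "government", "user", "developer", "ai_itself", "company"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "outrage", "fear", "approval", "resignation", "mixed"},
}

def validate_codings(raw: str) -> dict:
    """Parse a raw LLM response and index valid coding records by comment ID."""
    records = json.loads(raw)  # raises json.JSONDecodeError on malformed output
    coded = {}
    for rec in records:
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in CODEBOOK}
    return coded

# Hypothetical one-record response for illustration
raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"outrage"}]')
coded = validate_codings(raw)
print(coded["ytc_x"]["policy"])  # regulate
```

Failing fast on an out-of-vocabulary value surfaces model drift (e.g. a new emotion label) at ingest time rather than silently polluting the coded dataset.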