Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "All AI has done has made me acquire more work and responsibility for the same pa…" (ytc_UgwsntG3S…)
- "That reminds me of that that time I was looking at my wallpaper which was the p…" (ytc_UgwOVauFv…)
- "If YouTube is going to profit by A.I.-generated lies and anti-science, then pret…" (ytc_UgyuZxaO-…)
- "It's interesting how the people who understand, most superficially, the least ab…" (ytc_UgxvZf9Dc…)
- "> Would you rather send an army of robots to burn down another country or you…" (rdc_dwvtywi)
- "I know AI can't write a joke, but I never knew it was so good at drama...…" (ytc_UgwhlckIe…)
- "he's not talking about the middle class... those are the bottom rung jobs.. ai i…" (ytc_UgxAA3Vg6…)
- "It's called REDUNDANCY. Your life is much more safer if the vehicle you are in …" (ytc_UgyXLaub5…)
Comment
The solution is easy... you program in ALL the possible decisions that the car can make in a situation like that and if an accident happens you make it choose *randomly* so that there's no "moral blame" to the programmers. The number of lives that self-driving cars will save *FAR* outweighs these nitpicky moral dilemmas. The fact that you're using a safer device overall that reduces the chances of hurting people justifies the rare crashes that will happen.
youtube · AI Harm Incident · 2015-12-16T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugh8rhSAIlTrjHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj8lU9CWdFWf3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghFWNaMvDiVGngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgggdoqiWWgg1HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UggYKBs14QZPoHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi9xnzyNGEqq3gCoAEC","responsibility":"society","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugim4SKNBlRtfHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjXjm0R3slUzXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghxbQR1FcrERngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgipNWetGSuz7ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
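A raw batch response like the one above can be parsed into per-comment codings with a short sketch. This is a minimal illustration, not the pipeline's actual code: the allowed label sets below are inferred only from the values visible in this dump (the real codebook may contain more labels), and `parse_codings` is a hypothetical helper name.

```python
import json

# Allowed label sets per dimension, inferred from the values seen in
# the raw responses above (assumption: the real codebook may be larger).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "society"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist"},
    "policy": {"none", "industry_self", "regulate"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding},
    rejecting any record whose label falls outside the allowed sets."""
    out = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        coding = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in coding.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unknown {dim} label {value!r}")
        out[cid] = coding
    return out

# One record from the response above, used as a worked example.
raw = ('[{"id":"ytc_UghFWNaMvDiVGngCoAEC","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
codings = parse_codings(raw)
print(codings["ytc_UghFWNaMvDiVGngCoAEC"]["responsibility"])  # developer
```

Validating against a closed label set at parse time catches the common failure mode where the model invents an off-schema label, so bad records fail loudly instead of silently entering the coded dataset.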