Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “As a white guy in IT who tries to be non-racist & anti-racist, this does not sur…” (ytc_UgxdQgj1d…)
- “My art teacher makes up use Ai to generate art and I hate it. :)…” (ytc_UgxgN3b6b…)
- “Propaganda "but we're just raising true AI" is distraction from real problems re…” (ytc_UgxSjIu2V…)
- “@TheAngryDesigner Spreading fear is not necessary. Coming up with progressive ad…” (ytr_Ugxn5pZUy…)
- “Why would they wanna be working class any longer?! Why would a Lyft driver wanna…” (ytc_UgzH53xeE…)
- “thanks for sharing this!! I love this series and it gives me so much hope on the…” (ytc_UgyUopmDP…)
- “Earth is about 4.54 billion years old. 🌍 / Modern humans: ~300,000 years / Human gen…” (ytc_UgyX-mGED…)
- “Yeah, I'm imagining what's going to happen for a while is that they'll use AI to…” (rdc_jir9725)
Comment
Your accident example is wrong: an object falling from a truck will have a speed and direction similar to the truck it is falling from, so you have enough time to slow the car down. And generally, why is everybody asking how a self-driving car will decide who to kill? How often do these kinds of accidents actually happen? Once a week? Once a month? In the entire world?
A self-driving car will have a huge advantage over a human driver: it checks the situation on the road 100 or 1000 times per second, it knows the position of all the cars around you, it does not get distracted, sleepy, angry, or drunk, and it does not send text messages while driving. It will react instantly, compared to a driver who needs at least half a second (a car can travel 3-4 car lengths in that time).
youtube · AI Harm Incident · 2015-12-08T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UggmIyJ8SloWNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghTisOhXvg2MXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggJ7uf4xwzHrHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ughd4nDqmE0otngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghTs3eIZEp4CXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiT1_uxg4Qf93gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgiiYSCGtUOQQ3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ughs2ea7-kE5XHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugj_gIAyUkWWl3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggO5i8Su4Fd-HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
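The raw response above is a JSON array of per-comment codings, one object per comment ID, with one value for each of the four dimensions. A minimal sketch of how such a batch might be parsed and validated before the values reach the coding table, assuming only the code sets observed in this sample (the real codebook may define more values); `validate_batch` is a hypothetical helper, not part of the actual pipeline:

```python
import json

# Allowed codes per dimension, as observed in the sample response above.
# Assumption: the real codebook may permit additional values.
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a comment ID and a legal value per dimension.
        if "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(len(validate_batch(raw)))  # 1
```

Records with a missing ID or an out-of-vocabulary code are silently dropped here; a production version would more likely log them for re-coding rather than discard them.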