Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
So the basic question posed: should you crash randomly into something, or should you let the car take the action that leads to minimal damage? Yeah, I know, it's a hard one. Sometimes there are no good choices. One outcome may be that you go straight for the things falling. It minimizes casualties; what's wrong with that?
When you're in the car, you try to compute the casualties and minimize them yourself. The difference is that you only think about yourself, because you don't have time for complicated thoughts. Sometimes, in your attempt to avoid the accident, you drive off the side of the road on the mountain. Would you rather have the car roll the dice and decide like we do?
We already try for the fewest casualties. E.g., pedestrians ought to be avoided. But nobody frames it as: "But if you avoid the pedestrian who jumped like a fool in front of the car, you will hit another car, in which case you punish them for not dying as easily."
And I have one last note. Self-driving cars are going to follow driving rules. One of them is to keep a proper distance so you can hit the brakes in time. But I guess there could be a scenario where the decision above must be made, so while not dismissing the experiment, I would like a better example.
Platform: youtube
Topic: AI Harm Incident
Posted: 2015-12-13T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgjvJ6NnfmEbp3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggXZfa6C2KKR3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UghNp5BGhWiGfHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugicy25a1_k_VngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggVXqoniLKpUngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugi6d151MupypngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ughowk26EsgP-ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjPUX-D7spJtngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiXjYc4IT9HpHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugh_gj4KR5_ORngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
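The raw response above is a JSON array with one coding object per comment, so looking up a comment's coding by ID amounts to parsing the array and indexing it. The sketch below shows one minimal way to do that in Python; the variable names are illustrative and not part of any real pipeline, and only two of the rows above are reproduced for brevity.

```python
import json

# Raw LLM response: a JSON array of per-comment codings
# (two rows copied verbatim from the response above).
raw_response = """
[
  {"id": "ytc_UgjvJ6NnfmEbp3gCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggXZfa6C2KKR3gCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

# Index the codings by comment ID for O(1) lookup.
codings_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for a specific comment.
coding = codings_by_id["ytc_UggXZfa6C2KKR3gCoAEC"]
print(coding["responsibility"], coding["emotion"])  # prints: ai_itself resignation
```

Indexing by ID rather than scanning the list on every lookup matters once the response covers many batches of comments.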