Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below; a minimal look-up sketch follows the samples.

Random samples:
- ytc_Ugx2HVW7t…: What an interesting analogy. "--I wouldn't want ai to be so obsessed with effica…
- ytc_Ugzn0QrC5…: Being an AI ‘artist’ is like getting someone else to do your homework for you an…
- ytc_Ugyb-Ag35…: what I am hearing: PSA: IF YOUR TECH EMPLOYER IS TRYING TO MAKE YOU BUILD AI AGE…
- ytc_Ugz3NIfB6…: Anyone that says we don’t know how AI works is either trying to sell you on AI a…
- ytc_UgxXOGW8Z…: 12:00 section is completely stupid from your side. "Hey, guys let's compare pist…
- ytc_UgxP-xn7w…: intrastructure, without robots producing hardware and deploying and controling t…
- ytc_UgwrW6SdD…: Ai neither makes decisions nor it creates anything by itself. It only appears to…
- ytr_UgwEwfA1W…: @IQByte what i heard is doctors demand is infinite or too large. AI probably can…
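As a minimal sketch of the ID look-up, assuming the coded records have been collected into a JSON Lines file (the file name `coded_comments.jsonl` and the helper `load_coded` are hypothetical; the ID in the usage line is taken from the raw response at the bottom of this page):

```python
import json

def load_coded(path: str) -> dict[str, dict]:
    """Index coded comments by their ID for constant-time look-up."""
    with open(path, encoding="utf-8") as f:
        return {rec["id"]: rec for rec in map(json.loads, f)}

# Hypothetical file: one coded record per line, shaped like the
# records in the raw LLM response shown below.
coded = load_coded("coded_comments.jsonl")
print(coded.get("ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087ZUsAi0XrE"))
```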
Comment
> +Steve C The example is flawed. Just program the AI not to let itself get boxed in, and not to follow a truck closely enough that something falling off the back would be a serious hazard. Done. Next problem?
>
> See this is the issue, we can just program the AI to deal with these situations without causing accidents. ALL the examples of this dilemma I've seen are ones a self driving car will 100% of the time avoid in the first place. The kinds of accidents they will have is when another car swerves into them for literally no reason, or the brakes fail. ie. situations so rare that it almost doesn't matter how they react.
>
> Besides swerving is innately dangerous. You're almost always better of braking in a straight line. I think the people coming up with these dilemmas aren't transport engineers. These are problems humans have, not AIs.
Platform: youtube
Topic: AI Harm Incident
Posted: 2015-12-10T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgggitcG_CbrUXgCoAEC.87WDKCb8uB_87Wc8Git_dg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087ZUsAi0XrE","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087Zk-pe2Tvs","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgjFbGyekK77fngCoAEC.87VyK9Y8dlO87WlJMEeWoL","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgiiYSCGtUOQQ3gCoAEC.87VxmXkagiW87W5Qy0QLaG","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UghlFJ0pJ4lt_ngCoAEC.87Vxapt4mjd87W8NWV8_40","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UghfG4rKbyLwlXgCoAEC.87VxK1IcVdT87W9BZPSPbn","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"mixed"},
{"id":"ytr_UghiyJc91JDD0XgCoAEC.87Vw0Yij8ek87VyZ_yCQtY","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytr_UghiyJc91JDD0XgCoAEC.87Vw0Yij8ek87VzogUdvsL","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugg-LAtK9Y2urngCoAEC.87VuXpLFzck87VxWEbSmkv","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
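For reference, a minimal sketch of how a raw response like the one above could be validated before its records are stored as Coding Results. The allowed values are inferred from the codes visible on this page (the real coding scheme may define values not shown), and the function name `parse_coding` is hypothetical:

```python
import json

# Allowed values per dimension, inferred from the codes visible on this
# page; the actual coding scheme may include additional values.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"approval", "indifference", "resignation", "mixed", "fear"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only records that fit the scheme."""
    valid = []
    for rec in json.loads(raw):
        has_id = rec.get("id", "").startswith(("ytc_", "ytr_"))
        fits_scheme = all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
        if has_id and fits_scheme:
            valid.append(rec)
    return valid
```

Each record that passes corresponds to one Coding Result row like the table above.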