Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Meet Agility Robotics' Digit 2.0, a human-centric, multi-purpose #robot made for…
ytc_UgxirpTzf…
The latest news is that Tesla is now giving up on the promise of fully autonomou…
ytc_Ugw4a-Xy9…
What app are you using? I have the official ChatGPT app on Android and it does n…
ytc_Ugyvnjp2F…
I know you think this is a great counter argument, but it’s honestly making us f…
ytr_UgwB87kpg…
Im sorry dude, but your takes are bad here.
There is no actual difference AI ar…
ytc_UgyqgjloL…
I recently wrote a rhetorical analysis of President Roosevelt's speech at the de…
ytc_UgzP4SpJW…
so the AI was right, he was 99.9% likely to be involved in a shooting.....…
ytc_UgyYXBUQN…
@SarahNGeti wow you really are naive aren’t you? Free will? Self purpose? Go st…
ytr_Ugx5i7PTW…
Comment
Although this is a hypothetical scenario, it is too hypothetical to really be worth discussing. Why is the car following so close that it can't brake in time? And any object falling off the truck is still moving forwards, I'm sure the car can stop faster than any object. The car would never put the occupant at risk by following at such a distance anyway.
I've heard other scenarios as well and they are all avoidable if the car is driving safely to begin with.
You may as well be asking, "When my self driving car drives off a cliff, should it play show tunes to lighten the mood?". It's a ridiculous scenario! How would a car have access to show tunes 😜
youtube
AI Harm Incident
2015-12-11T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugjwkh7gbtadm3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UghlKJ8Nc_INgHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UghjkWiCvWeo1ngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghilDXtRwfSCngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjN2KgJTlwlC3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh54ZJdEXZfwngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjWA5kpI1F_UHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjB5N2AWV6PlHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh2zj0x13RnS3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgiaFVeDpzC9U3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
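The raw response above is a plain JSON array, one object per comment ID, with one value per coding dimension. Below is a minimal sketch of how such a response might be parsed and validated before being loaded into the coding table. The allowed category sets are assumptions inferred only from the values visible on this page, not the project's actual codebook, and `parse_codings` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from
# the responses shown above; the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "mixed", "indifference", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row needs a comment ID to join back to the source comment.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Drop rows whose values fall outside the expected categories.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')
print(parse_codings(raw))
```

Validating against a closed category set like this catches the common failure mode where the model invents a label outside the codebook; such rows can then be queued for re-coding rather than silently stored.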