Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgzXx4eYl…: “AI is going to destroy the human species. Once unleashed, there’s no scenario th…”
- ytc_UgwUpyHJ1…: “I knew Ai or A1 was a powerful destructive force when I got mortal kombat 2 for …”
- ytc_Ugx3DouJs…: “2030 the only jobs left for humans require compassion, anything else can be done…”
- ytc_Ugy9FhW9G…: “It’s happening because we know microsoft is forcing is devs to vibe through all …”
- ytc_UgwG-jFB8…: “These AI remind me of the movie Short Circuit. A robot with a supercomputer, gat…”
- ytc_UgySCc0kC…: “The skin needs to be slightly translucent to let light reflect beneath it, and t…”
- ytc_UgzNGaw4z…: “You see the people in these comments with their oh so very original opinions, an…”
- ytc_UgwRvxOtU…: “Artists makes something out of nothing, Ai bros are nothing, nothing angry that …”
Comment
Nice job, Uber, you've finally managed to kill someone all by yourselves. May you be sued into the poor house.
You're taking 80% and more of trip fares, despite the agreed upon (and excessive) 20-30% drivers agreed to forfeit for taking 100% of the risk for you. Now you've managed to actually kill someone with your mindless robot.
How did the vehicle fail? The spokes of the pedestrian's bicycle fouled the sensors, it could not tell that it was approaching a solid object, if it even registered that there was an object ahead.
Source: youtube
Posted: 2018-03-20T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugza7KWbDsEmvpmyFRh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCV1WSHNNoAWEQIYB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxEKipMBtlquQ73dAJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw7p_ybxetNLmd20S14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzbaqcv6M2eQ5jR0Mp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxn07BJeDe9qC1QmI14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwXrNrt8uWT_CDG5XB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugw_YyfGTzNseYQGsWR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyxresGdU-QPZc5VMJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxbVYViITHfX4_ucmV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
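A raw response like the one above has to be parsed and checked before the codes are stored, since LLM output can drift outside the codebook. Below is a minimal validation sketch in Python; the allowed category values are an assumption inferred from the sample output shown here, and the real codebook may contain more categories.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred
# from the sample responses shown above, not from the full codebook.
SCHEMA = {
    "responsibility": {"company", "ai_itself", "government", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors.

    An empty list means every row parsed and every dimension value
    fell inside the (assumed) codebook.
    """
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"unparseable JSON: {exc}"]
    errors = []
    for i, row in enumerate(rows):
        if "id" not in row:
            errors.append(f"row {i}: missing id")
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                errors.append(f"row {i} ({row.get('id', '?')}): bad {dim}={value!r}")
    return errors
```

Running `validate_batch` over a well-formed row returns `[]`; a row with an unknown category or missing `id` produces one error string per problem, which makes it easy to re-prompt only the failing comment IDs.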