Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- LOL, this was what I was saying all along. If people think that their AI is goin… (ytc_UgwGeei1M…)
- Imagine trying to save artists' jobs by banning ai art and banning real artists … (ytc_UgwILuGoC…)
- I saw dall e 2 and it's actually impressive / I mean I could mistake it for real a… (ytc_UgznaTgkn…)
- I don't know how many people will see this or if anyone will know about it, but … (ytc_Ugy6hH5xB…)
- Ai can barely generate the word pie in their art, idk how tf an AI movie is gonn… (ytc_Ugz-Z6nwl…)
- Anyone who believes this guy is dumb or gullible. They will do bad things and bl… (ytr_UgxbS0pEy…)
- ✅ Mutual Oversight Operational Checklist / Required for every AI system involved … (ytc_UgyjLzJSC…)
- Submission statement: “Technology always makes more and better jobs for horses … (rdc_mrr459q)
Comment
If enough self-driving cars are on the street and they can talk to each other, several cars will make wild maneuvers to avoid the accident together. For example, the truck might brake while your car accelerates to catch the object before it drops on the street. Cars on the right might make space and clip the object to bounce it off the street. Your car might talk to the SUV to generate a space for you to squeeze into without the need to crash. All the cars behind you will brake to create more reaction time.
Remember: The goal here isn't 0% accidents, ever. The goal is to have more options when something happens or before something happens. A human driver might not notice when the cargo starts to shift, but sensors can. Sensors don't get bored or tired. Sensors could stop the truck from moving unless the cargo is secured properly. The autonomous truck could run tests, like going to 5 MPH and then doing a full brake.
Self-driving cars might move into oncoming traffic to avoid an accident because they have the reaction time necessary. It would freak the hell out of the passengers, but it's actually quite harmless. The car would know the actual risks because it would have talked beforehand to all the cars in the oncoming traffic, so it would know which ones would cooperate and which ones it has to avoid.
Instead, we need to make sure that drivers can't take control of these systems to "have fun" like racing into oncoming traffic for thrills.
youtube
AI Harm Incident
2015-12-08T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UghzuMMpBbsZkngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ughhc-RnxMS1LXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjY6HJikXmw-HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghIQezVaUOb-3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgicExh_IjSyAXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjRcySEHSlNsngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugha7FPAvBu3AngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UggqSiIbUqJPI3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UghJ2QECp_kzO3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghQd27Kawk0s3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
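The raw response is a JSON array of per-comment codings, one object per comment ID, so lookup by ID reduces to a parse-and-index step. A minimal Python sketch, assuming only the schema visible above; the helper name `index_codings` and the two embedded sample records are illustrative, not part of the actual tool:

```python
import json

# A raw LLM response in the shape shown above: a JSON array where each
# object codes one comment on responsibility, reasoning, policy, emotion.
raw_response = '''[
{"id":"ytc_UghzuMMpBbsZkngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjY6HJikXmw-HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the coding records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
# Look up one coded comment by its ID.
print(codings["ytc_UgjY6HJikXmw-HgCoAEC"]["responsibility"])  # prints: user
```

The same index also makes it cheap to spot-check the model output, e.g. confirming every record carries all four dimensions before the codings are stored.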