Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If enough self-driving cars are on the street and they can talk to each other, several cars will make wild maneuvers to avoid the accident together. For example, the truck might brake while your car accelerates to catch the object before it drops onto the street. Cars on the right might make space and clip the object to bounce it off the street. Your car might talk to the SUV to open a gap for you to squeeze into without the need to crash. All the cars behind you will brake to create more reaction time.

Remember: the goal here isn't 0% accidents, ever. The goal is to have more options when something happens, or before something happens. A human driver might not notice when the cargo starts to shift, but sensors can. Sensors don't get bored or tired. Sensors could stop the truck from moving unless the cargo is secured properly. The autonomous truck could run tests, like going to 5 MPH and then doing a full brake.

Self-driving cars might move into oncoming traffic to avoid an accident because they have the reaction time necessary. It would freak the hell out of the passengers, but it's actually quite harmless: the car would know the actual risks because it would have talked beforehand to all the cars in the oncoming traffic, so it would know which ones would cooperate and which ones it has to avoid. Instead, we need to make sure that drivers can't take control of these systems to "have fun", like racing into oncoming traffic for thrills.
youtube · AI Harm Incident · 2015-12-08T21:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytc_UghzuMMpBbsZkngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ughhc-RnxMS1LXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgjY6HJikXmw-HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UghIQezVaUOb-3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}, {"id":"ytc_UgicExh_IjSyAXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgjRcySEHSlNsngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugha7FPAvBu3AngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"}, {"id":"ytc_UggqSiIbUqJPI3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UghJ2QECp_kzO3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UghQd27Kawk0s3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"} ]