Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Here is what I think: first, the car should keep a safe distance in this kind of scenario, and the same applies to many other potential scenarios. Let's be honest, many accidents happen because people drive too close to each other and have no time to stop. But with an electronic brain that can react in a split second, the opposite can also be true: driving ultra close, which as a bonus reduces air resistance, because the car will brake instantaneously if there is a problem. And of course it should forbid heavier vehicles from tailgating lighter ones (no one wants to be sandwiched between two heavy trucks, self-driving or not). The scenario of objects falling off trucks can be handled by not tailgating, or even following, vehicles prone to losing cargo, and perhaps by keeping a heavy truck behind rather than a car. Every truck holding an electronic certificate proving its cargo is secure would become a "target" for tailgating by self-driving (and safe) cars. That would also reduce traffic congestion and pollution even more. But even if this scenario occurs, the right behavior is probably for our vehicle to avoid the others, which means hitting the heavy object, because after all you bought this car and agreed that this could happen! Even so, with a safe distance and instant braking, this car might still save the life of the "driver", unlike what would happen with a regular person driving, because the brain's reaction delay is reduced to almost nothing for a computer. Also, don't forget that computers drive better: disable all the stability features on most sports cars and you will understand how good they are at mastering trajectory and stability. A computer can avoid an object without hitting someone else, because it can find the sweet spot of acceleration and braking that gives the safest possible escape trajectory.
Anyway, we should NEVER program cars to harm anyone in any way; the owner is responsible for his car even if it is self-driving. Because consider an empty self-driving car where someone has strapped a heavy bag under the seat belt to fool the system into thinking someone is inside: in that case it becomes clear that a self-driving car should not pursue self-preservation. Of course, bumping into another truck or its trailer could be considered, since a truck or its cargo would only suffer property damage with a relatively low chance of harm to its driver. But in a road accident no one else should be involved; those inside the self-driving car will still have a greater chance of survival, without taking the risk of harming someone else, thanks to the computer's superiority...
YouTube AI Harm Incident 2015-12-09T13:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugi5b5pbaFA4-HgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ughno_FgymJ6c3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Uggj7mDHrma5v3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggNqFSJ4vIgCngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UggrI2xcSyTYu3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgiaKBAMwPkUFngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugj26mPeq39upXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghpeMaaKGI5aHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggWaE2jszLulngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghT2utfCq_hLXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
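The raw response above can be parsed back into the per-comment dimension table shown earlier. Below is a minimal sketch in Python: it is not the pipeline's actual code, just an assumed decoding step using the standard `json` module, with the response truncated to two of the records for brevity.

```python
import json

# Raw LLM response as emitted by the coder (truncated to two records here).
raw = """[
  {"id": "ytc_Ugi5b5pbaFA4-HgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggNqFSJ4vIgCngCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

records = json.loads(raw)

# Index the coded dimensions by comment id, dropping the id from the values.
coded = {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

# Look up one comment's coding, as displayed in the dimension table.
print(coded["ytc_Ugi5b5pbaFA4-HgCoAEC"]["emotion"])  # approval
```

Each `id` keys one YouTube comment, so a lookup in `coded` reproduces the Responsibility/Reasoning/Policy/Emotion values rendered for that comment.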