Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I told you I told you 100 times this car Tesla is a failed system. …. That’s level two only from camera lenses no sensors nothing nothing it’s software stupidity, charging that kind of money for this type of a car….. buy the Mercedes S class which is legal 100%. You can watch videos to play videos take a nap. It will drive by itself 100%. I tried it in Germany. Oh my goodness… it’s legal 100% to take a nap to play video games while you while you’re sitting in the driver seat you can eat watch a movie whatever because the car is 100% self driving that’s technology … the Waymo it’s beautiful. I take it all the time. It’s gorgeous driving perfectly amazing and wanted to avoid vehicle…. A bicycle tire blown off so the person fell on the road. The Tesla avoided the person and drove into the car that is coming in the opposite direction that’s not avoidance. That’s a disaster…. And the person goes. Oh thank God I had Tesla. I’m like are you? Are you crazy? It’s supposed to avoid the person not getting into an accident with another car. I’m supposed to avoid both drives in the middle in the center. Plenty of space but you see because it has no brain no sensors no nothing. It just made it turn sharp turn and drove into another car, avoiding the human being, filing falling on the from the bicycle.
youtube 2026-03-12T20:2…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          ban
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxQ1eYqXuG14lHsOPZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzL9eggQhKTCpDf1fl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxXyGkUgPvwahjDotx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxDIWltdm2fqPSrYGR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxfZA7DRTafrZMPhal4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxQmI1gweWNnRuDMLF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgytQU423ALBiJqYy1N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzOAAcQLWMN_sXNlSt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwzbVHmPjrpUtZrFPV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw3Sv9uZ994x6Px6CB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
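A raw response like the one above can be parsed and checked against the coding schema before it is ingested. The sketch below is a minimal validator, assuming the label sets visible in these responses (e.g. `company`/`ai_itself`/`none` for responsibility) are the full allowed vocabulary; the actual codebook may define more labels.

```python
import json

# Allowed values per coding dimension -- assumption: inferred from the
# labels that appear in the raw responses shown here, not from the codebook.
SCHEMA = {
    "responsibility": {"company", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"ban", "liability", "none"},
    "emotion": {"outrage", "indifference", "mixed", "resignation", "approval"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any record with an
    unknown dimension value or a malformed comment id."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r}")
    return records

# Two records taken verbatim from the raw response above:
raw = """[
 {"id":"ytc_UgzL9eggQhKTCpDf1fl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugw3Sv9uZ994x6Px6CB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""
coded = validate_coding(raw)
print(len(coded))  # → 2
```

Failing fast here is deliberate: a single off-vocabulary label from the model would otherwise silently skew downstream counts per dimension.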