Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Really good video today and I couldn't agree more. Amazon can't even get a delivery cart to work properly. And that hurts no one when it loses it. I guess it's really annoying though. LOL One of these things marched right through multiple police barricades in an active shooter situation. I'm not making this up. And while I find it hysterical, the police didn't. Can you imagine? ROFL. Amazon shut down the whole division and apparently has given up on the whole idea. Obviously, this was just one of many problems. Kind of like their delivery drones idea. But, I can easily see how the Tesla mistook bike tail lights for car tail lights at a distance. I've done that when it's at night and the bike is on the horizon, a car is in the distance, and both the motorcycle and car tail lights matched up. Had a close call once. I can easily see how my close call would be a crash and a catastrophic failure for a computer. That's why you can't trust them. I love how it shuts down 1 second before impact to try to skirt liability, but it hasn't helped. Tesla might just get sued out of business over all the lawsuits this fake auto-pilot is causing. And Tesla actively promotes the system as a true auto-pilot when they try to sell it to you. I guess it costs around 15 grand extra. And for 15 grand? Yes, it should be able to at least drive the car. But always remember, if you have to plug in scenarios to "teach" it, you're NOT teaching it, you're programming it and that's NOT AI. That's not intelligence of any sort. AI is, and always has been a pipe dream and plots for Sci Fi movies. Just like "Mr. Fusion."
youtube AI Harm Incident 2022-10-08T18:2…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           ban
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_Ugxn6fIVcOPdKFNfmyN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgwijjM_CmEBeRL5oWZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyZo4YsKITx4N8XFkh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgxjmhhqgdruTWgJTxN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgylJFKJN5_qVD_nrlB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgwHY3KQCA-LV9U9fip4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"mixed"},
 {"id":"ytc_UgybQLh7iFLMBWxReml4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxaYGWmCxrIdHCP9rh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxUMCQxGL2ak5MWvWJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugy5c-j3VyDUSAMR0W94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
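To inspect a single comment's codes, the raw response can be parsed as a JSON array and indexed by comment `id`. Below is a minimal sketch; the `codes_for` helper is illustrative (not part of any tool shown here), and the `raw` string reproduces two entries verbatim from the response above.

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw = (
    '[{"id":"ytc_UgwHY3KQCA-LV9U9fip4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"ban","emotion":"mixed"},'
    '{"id":"ytc_Ugxn6fIVcOPdKFNfmyN4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]'
)

def codes_for(comment_id: str, raw_response: str) -> dict:
    """Parse the model's JSON array and return the coding row for one comment."""
    rows = {row["id"]: row for row in json.loads(raw_response)}
    return rows[comment_id]

# Look up the comment whose coding result is tabulated above.
codes = codes_for("ytc_UgwHY3KQCA-LV9U9fip4AaABAg", raw)
print(codes["policy"], codes["emotion"])  # ban mixed
```

Note that the tabulated result (company / consequentialist / ban / mixed) matches the `ytc_UgwHY3KQCA-LV9U9fip4AaABAg` entry in the raw array, which is how the coded comment can be traced back to the exact model output.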