Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It should be incredibly illegal for companies to beta test their cars and or safety products on our roads with people. It's DEEPLY unethical and outright dangerous. All new safety system should be rigorously and thoroughly tested, and subsequently certified by independent third parties and government requirements long before they ever make it to the road. It's mind-boggling why lawmakers have it done something about it, but then again super pacs and billionaires clearly have a hand in that. I am fully on board with the idea of fully autonomous vehicles, but only once they've been rigorously tested and certified. No inbetween in my opinion it's either full or nothing, these partially autonomous or quasi autonomous driving modes that manufactures have been adding to cars are equally dangerous in my opinion. Because at the end of the day people are lazy, and they will definitely improperly use one of these quasi autonomous modes. Personally I am not even a fan lane assist or smart cruise etc, that also leads to distraction IMO. The more the car does for the driver if it's not fully autonomous the more the driver is going to likely drive in a distracted manner.
youtube 2026-02-05T23:0… ♥ 3
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgzfhubjY3_AeX4uO-d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzLIKTmQh_6EiXlE594AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx8ACDk69Ug2yakGvl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxA0py79JkdIWey16R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzdBMG8G293u0rAnnN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzQBKBBY1s7m-GjVTB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw5GhHldVL5tfdO_ux4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy4kTf7o8caZmwrUax4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxqlVGY7FYCg6o-XZp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw1nKoIB_TzROwZZpp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
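To inspect a single comment's coding, the raw response can be parsed and indexed by comment id. A minimal sketch, assuming the raw response is the valid JSON array shown above (the variable names and the truncated two-entry excerpt are illustrative, not part of any pipeline):

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array of per-comment codings.
raw_response = '''[
  {"id": "ytc_UgzfhubjY3_AeX4uO-d4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxA0py79JkdIWey16R4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

# Index the codings by comment id so any single comment can be looked up.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgzfhubjY3_AeX4uO-d4AaABAg"]
print(coding["policy"])   # liability
print(coding["emotion"])  # outrage
```

Indexing by id also makes it easy to diff the raw model output against the stored coding result, e.g. to spot entries where the two disagree on a dimension.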