Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Yes, but it's all flawed. It's based on the idea that cameras are enough, and no other sensors are needed. It's wishful thinking, and the engineers are clearly too scared to speak up. My boss has the same mindset as Elon, never accepting that we are the experts, ignoring our very real concerns, just to see what we told him would happen, happen. As a machine learning engineer: cameras don't see depth. It's pattern recognition in a 2D plane, and accidents follow no pattern. They're absolutely random in nature, and no two accidents will look alike to cameras. With a lack of depth sensing, and accidents creating a random image, cameras are not, and will never be, enough. Even if it's perfect 99.9% of the time, you need redundancy. The new Boeing planes crashed because they relied on one pitot tube for the data deciding whether to automatically push the nose down. You can't have computers making decisions off of one input of data. I am very interested in what Tesla is doing, and they have the potential to be leaders given they are collecting almost all of the data. But cameras-only is a flawed design logic, and as an engineer, you can't let the CEO tell you to do things that are not physically possible.
Source: youtube · AI Harm Incident · 2024-12-14T05:3… · ♥ 10
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgxRBGLpDfwGGHwfmnh4AaABAg.AC0UJWzPiG7AC0_ogY4QSt","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgytQW9BLcbyDBIS2NN4AaABAg.AC0POaey3ymAC20gcQdjQU","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgySl15Pb7mu_PZfk1R4AaABAg.AC0NJ5WjjtaAC1tWN7J-mi","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgySl15Pb7mu_PZfk1R4AaABAg.AC0NJ5WjjtaAC3oAQs0EY7","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxAbZ4ZHehzWNIUECB4AaABAg.AC07c7wPMkoAC0XZ7Etoxr","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugx0N4YnAhy58Ub84jZ4AaABAg.AC06NFac4-LAC2Aomapbdj","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxsB-Fm0MxUMcYCSah4AaABAg.AC021rOkvQqAC14oDh25H2","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugyd2NaGb_7XVQpjeWl4AaABAg.AC01dge9mHGAC08Byjr4Rm","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugxno93GCTvXvZth3GZ4AaABAg.AC-zPliLvA7AC0AJgMTu8O","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugy2jc6StZwBvUDddN94AaABAg.AC-xr5sQ6f7AC0AVBuPdfq","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
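When inspecting raw LLM output like the batch above, it is useful to confirm that every record carries the four coding dimensions with values from the codebook before it is stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the labels visible on this page (the real codebook may include more categories), and the function name `validate_batch` is illustrative, not part of any pipeline shown here.

```python
import json

# Allowed labels inferred from the records shown above; assumption,
# not the authoritative codebook.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "regulate", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response string and keep only records whose
    labels all fall inside the allowed value sets."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Usage: a well-formed record passes, an off-codebook one is dropped.
sample = (
    '[{"id":"ytr_example","responsibility":"company",'
    '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]'
)
print(len(validate_batch(sample)))  # → 1
```

Filtering rather than raising keeps one malformed record from discarding the whole batch; the dropped IDs can then be re-queued for recoding.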