Raw LLM Responses

Inspect the exact model output returned for any coded comment.

Comment
11:00 I am not engineer, but that's sounds so dumb. They just need to detect any object in the path and treat it as an obstacle. It should not matter, what that object actually is. It's not about categorizing things, it's about not hitting anything. It feels like this could be hard coded, no need for AI. People and animals too close to the car should also be easy to detect with simple proximity sensors. Are you telling me, a car can warn me from getting closer and closer to any object when going in reverse, but AI cars can't tell? This is all so weird.
youtube 2026-03-29T17:5…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         outrage
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_UgxbWKEvMvQKTyFYOkl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz-qeTASm3A5-Zb2F54AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxl53Ut8dwqi4we8_t4AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyUHNOlVIVCzgCeif14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyiRrJ5JV0qlf9tSU54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
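As a minimal sketch of how the coded dimensions above can be recovered from the raw response (assuming the model returns valid JSON and each record carries the comment's `id`, as in this batch), the per-comment result is found by parsing the array and indexing on `id`:

```python
import json

# Raw LLM response, abridged here to two of the five records shown above.
raw = """[
  {"id": "ytc_UgxbWKEvMvQKTyFYOkl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz-qeTASm3A5-Zb2F54AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]"""

# Parse the batch and index the coded records by comment id.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Look up the record for the comment inspected on this page.
coding = by_id["ytc_Ugz-qeTASm3A5-Zb2F54AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → developer outrage
```

The `by_id` dictionary name is illustrative; the point is that a batched response must be re-joined to its source comments by `id` before the per-comment table above can be rendered.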