Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
That happens, when you replace a system like radar, which measures real distance, speed or acceleration, with a AI systems, which estimates the same parameters, based on „experience“. I am pretty sure, also cost reduction played a role in this step. Unfortunately, learning from 1000th of new situations, also can cost the life of other motorcyclists or people on the road. Banana product = gets ripe at customer.
youtube AI Harm Incident 2023-06-04T20:4…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          liability
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugz5JvaEtbdngAYBKr54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgyTd9RcmLc80IMz_bx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz0YauaoeG4gKPnGCh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugw_YI2Ifg0f1gGbMfl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxqOTDgKAHDoa3cB094AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxXzGMjClFBAz7qKrZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwDmUxkMau3kOQUJwB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzi3RVNii45NRTl9Ox4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgyPNJQFb_cWuftG0eR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugx8lhcmmwi7e3__7UR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
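A raw response like the one above can be checked before the codings are stored. The sketch below is a minimal Python validator; the `CODEBOOK` sets are inferred only from the values visible in this export (the real codebook may define additional categories), and `validate_batch` is a hypothetical helper name, not part of any pipeline shown here.

```python
import json

# Allowed codes per dimension, inferred from this export (hypothetical --
# the actual codebook may contain more categories than appear here).
CODEBOOK = {
    "responsibility": {"company", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"liability", "ban", "regulate", "industry_self", "unclear"},
    "emotion": {"resignation", "fear", "indifference", "outrage", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every coded comment should carry a YouTube comment id.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Keep the record only if every dimension holds an allowed code.
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_Ugz5JvaEtbdngAYBKr54AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"resignation"}]')
print(len(validate_batch(raw)))  # → 1
```

Validating against a fixed codebook catches the common failure mode where the model invents a label outside the taxonomy; such records are dropped rather than silently stored.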