Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
In terms of percentage, how bad are the autonomous systems compared to the average human driver? We're comfortable with insane drivers on the road. A real, independent statistical validation is crucial. From there, where does the liability lie? The company should be liable 100% when the programming causing problems.
youtube 2026-02-13T16:5…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          liability
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgzrGNqWUEGH_eyGgtB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwX_4wiEWJ2JQeFy4B4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx8_dVtkycPYyK_-AJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy-qWoZRRq1AjZG39Z4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwqXxKw8-RlNQOIRNV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxO0KlRSAwtTmw1R6t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzagHd_mq40AnV9uP14AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwYVVRxy7BqP_g40294AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwqQZTlnR0u9OhF-9x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz8VGngs0kI-J0z8-B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
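The raw response is a JSON array with one object per coded comment: an "id" plus the four coding dimensions shown in the table above. A minimal sketch of pulling one comment's coding out of such a batch, assuming exactly the shape shown here (the function name and the inline sample are illustrative, not part of any real pipeline):

```python
import json

# Hypothetical inline sample mirroring one entry of the batch above.
raw = '''[
  {"id": "ytc_UgxO0KlRSAwtTmw1R6t4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "indifference"}
]'''

# The four coding dimensions every entry is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(batch_json: str, comment_id: str) -> dict:
    """Return the four-dimension coding for one comment id.

    Raises KeyError if the id is absent or an entry lacks a dimension,
    which surfaces malformed LLM output instead of silently skipping it.
    """
    for entry in json.loads(batch_json):
        if entry.get("id") == comment_id:
            return {dim: entry[dim] for dim in DIMENSIONS}
    raise KeyError(comment_id)

print(coding_for(raw, "ytc_UgxO0KlRSAwtTmw1R6t4AaABAg"))
```

Looking up by id rather than by array position matters because the model is not guaranteed to return entries in the order the comments were submitted.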