Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I hope this never happens because when one of these automated cars makes a mistake and kills someone, who takes the responsibility? The owner? They'll blame the software company. The company will shrug their shoulders and say shit happens and lobby for legislation that shields them from responsibility. Humans aren't perfect but when they make a mistake they are held responsible.
Source: YouTube · "AI Jobs" · 2016-12-27T03:2… · ♥ 2
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugg0EV0tker_cXgCoAEC", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgiiJukA2BGiAXgCoAEC", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugiah2c14h-uD3gCoAEC", "responsibility": "distributed", "reasoning": "virtue",           "policy": "unclear",   "emotion": "outrage"},
  {"id": "ytc_UggX0gkKg3YWUngCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UggF4SgKJkgFO3gCoAEC", "responsibility": "government",  "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_Ughj703Fl-4v1HgCoAEC", "responsibility": "company",     "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgjAOocpQuyPbngCoAEC", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UggGmX4M8SgVn3gCoAEC", "responsibility": "government",  "reasoning": "consequentialist", "policy": "ban",       "emotion": "fear"},
  {"id": "ytc_UgjnzyeHptnHBHgCoAEC", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UghyjEnZxI719XgCoAEC", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",      "emotion": "resignation"}
]
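To inspect the codes assigned to one comment, the raw response can be parsed as a JSON array and indexed by comment id. This is a minimal sketch (not the tool's own code) using field names and one id taken from the response above, with the array truncated to a single entry for brevity:

```python
import json

# Raw LLM response: a JSON array of per-comment codes (truncated to one entry here).
raw_response = '''
[
  {"id": "ytc_Ughj703Fl-4v1HgCoAEC", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
'''

# Parse the array and build an id -> codes lookup table.
codes = json.loads(raw_response)
by_id = {entry["id"]: entry for entry in codes}

# Retrieve the coding result for the comment shown above.
result = by_id["ytc_Ughj703Fl-4v1HgCoAEC"]
print(result["responsibility"], result["emotion"])  # → company outrage
```

Each dictionary in `by_id` maps directly onto the Dimension/Value rows of a coding result, so a mismatch between the table and the raw output is easy to spot.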