Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
Basically drop the notion of 'I' in 'AI' - the driver fell into the false sense "Oh the car is equivalent of a second 'me' ". One of those odd situations where a cluster of events conspire to end in a fatality. In the long run autonomous controls should be a 'back seat' driver not the one in front. It should always be the driver who is responsible - and more so on these test runs.
Source: youtube — AI Harm Incident — 2018-03-22T13:0…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxzxBdh4ic0G0rVpBt4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgzH4QZI5SK5ZDudo5N4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgzeR2MAoXpzAeoJuSt4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgxaPEnnVPVJKXCaAH94AaABAg", "responsibility": "user",      "reasoning": "virtue",           "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgzENXYaM8mBbW0okPB4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgzTx0_sm-U6lOTFXgZ4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgzEyiLflYKd8fQR_zh4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgyclY637vkvrbuvSPF4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "ban",       "emotion": "outrage"},
  {"id": "ytc_UgwCv5H0mcTPDuELTat4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyGFiRB7ETNFOrLDy54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"}
]
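To spot-check a single comment against the raw model output, the JSON array above can be parsed and indexed by comment id. Below is a minimal sketch; the `codes_for` helper and the abbreviated two-record `raw_response` string are illustrative assumptions, not part of the pipeline shown here (ids are taken from the response above).

```python
import json

# Abbreviated stand-in for the raw LLM response shown above:
# a JSON array of per-comment code records (only two records kept here).
raw_response = """
[
  {"id": "ytc_UgzENXYaM8mBbW0okPB4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxzxBdh4ic0G0rVpBt4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
"""

def codes_for(comment_id: str, raw: str) -> dict:
    """Parse the raw model output and return the code record for one comment id."""
    records = json.loads(raw)
    by_id = {rec["id"]: rec for rec in records}
    return by_id[comment_id]

# Look up the codes for the comment displayed in this section.
codes = codes_for("ytc_UgzENXYaM8mBbW0okPB4AaABAg", raw_response)
print(codes["responsibility"], codes["emotion"])  # user resignation
```

This reproduces the table above (responsibility=user, emotion=resignation) directly from the raw response, which is useful for verifying that the displayed coding result matches what the model actually emitted.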