Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You can put a constraint on the self-driving car that it is not allowed to follow so close to a vehicle that it cannot stop in time should something fall off the back of it. That would avoid the posited decision entirely.
youtube AI Harm Incident 2015-12-08T16:2… ♥ 100
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugib-iFxInRg4ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgzFUQ8GPbt9yNaOdT54AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxzSftSN_wFULNqL1p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwD9iQ649sEtw3M6St4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzM2AJixsFVY7jTAzx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugyt3bmP7CoVO_VJ2MJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyCk20oDNePJZ_rk3t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyW_HWJCoDz2SPktxp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxigiyuOS1GAZLcicJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxx6U2krVj_dYW15ZV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
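The raw response is a JSON array, one object per coded comment, so recovering the coding result for a single comment (as shown in the table above) is a matter of indexing by `id`. A minimal sketch, assuming only the standard library; the `raw` string below reproduces two entries from the batch above, and the variable names are illustrative, not the tool's actual pipeline:

```python
import json

# Two entries copied from the raw batch response above.
raw = """[
  {"id":"ytc_Ugib-iFxInRg4ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgzFUQ8GPbt9yNaOdT54AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"approval"}
]"""

# Index the batch by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the coding result for one comment.
result = codings["ytc_Ugib-iFxInRg4ngCoAEC"]
print(result["responsibility"])  # -> developer
print(result["emotion"])         # -> indifference
```

In practice one would also validate that each dimension's value falls in the expected code set (e.g. `responsibility` in {developer, company, ai_itself, distributed, none, unclear}) before displaying it, since the model's raw output is not guaranteed to be well-formed.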