Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
So, we can't have self driving cars that could save hundreds of lives and just make life easier in general, because when an accident does happen, we can't figure out who to blame... We humans have a blame fetish. If there is a problem... someone has to be blamed.
YouTube · AI Harm Incident · 2016-03-26T00:2…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       contractualist
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwYshySr1KKtQlUrWh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwm77fbkqFuJ6JdfCZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwn0pu2DCy34x1c1md4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwfwoAS2VTAqnTId1F4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugh5_R7ugVcnAXgCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UghK9JWzzYfksHgCoAEC", "responsibility": "distributed", "reasoning": "contractualist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggrhJ60UdmN_3gCoAEC", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgivsdFhEaHtTngCoAEC", "responsibility": "company", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzkrTgLLBZw79vHxyl4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugw5GVwik-fYBdaqPQh4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"}
]
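The raw response is a JSON array with one object per comment id, carrying the four coded dimensions. A minimal sketch of how such a response could be parsed and matched back to a comment (the two sample records are copied from the raw response above; the variable names are illustrative, not part of the app):

```python
import json

# Two sample records copied from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UghK9JWzzYfksHgCoAEC", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgivsdFhEaHtTngCoAEC", "responsibility": "company",
   "reasoning": "virtue", "policy": "ban", "emotion": "outrage"}
]
"""

# Index the coded dimensions by comment id for lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

# The Coding Result table for the quoted comment corresponds to this id.
result = coded["ytc_UghK9JWzzYfksHgCoAEC"]
print(result["responsibility"], result["emotion"])  # distributed approval
```

Indexing by `id` makes it easy to verify that the Coding Result shown in the table matches the corresponding record in the raw model output.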