Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It occurs to me that if I look on a map for my nearest Indian restaurant, it is ALWAYS inaccurate. If the f*cking Israelis are using so-called AI (complete misnomer) to find the IRGC it is entirely to be expected this will result in schoolgirl deaths. Given the accuracy of other strikes, it might be fairly deduced that they don't actually care. ie - it's important we actually GET an ayatollah, but we don't give a sh*t otherwise.
youtube 2026-03-06T12:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugz31zHXz4K6OxP5TGB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyHJpk5wFJVL0cqBr14AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwynW7f2oA1AzyAhAx4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxREKLiPySVLMe0VSN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyaeLOgrO4dyjudG494AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz5JFsXuBbgyJgSez54AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzSEd6JA1K0_vZ75h94AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxHw17_a0aOocJjwgN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw0ucwNjvBC9Ki6Muh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwzzSgqrTmHBDXF2Q54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
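A raw response like the one above can be parsed and sanity-checked before the per-comment values are written into the coding-result table. The sketch below is a minimal validation pass, assuming the four dimensions shown here are the full schema; the allowed value sets are inferred only from the values visible in this export, not from the actual codebook.

```python
import json

# Allowed values per dimension — inferred from this export; the real
# codebook may define additional categories (an assumption).
ALLOWED = {
    "responsibility": {"none", "developer", "company", "government", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "indifference", "approval"},
}

def validate_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against ALLOWED."""
    records = json.loads(raw)
    for rec in records:
        # Every record should carry a YouTube-comment id prefixed "ytc_".
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad or missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

# One record copied from the response above.
raw = ('[{"id":"ytc_UgxREKLiPySVLMe0VSN4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
records = validate_raw_response(raw)
print(len(records))  # → 1
```

Failing loudly on an unknown category value catches the common failure mode where the model invents a label outside the codebook, which would otherwise silently corrupt downstream tallies.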