Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
And the best part is: If the AI hallucinates, like it has done so many times. It will be the lawyer that signed the paper that gets the blame. Because it's the lawyer that says "I read and confirmed everything in this document is true" when they sign it. The AI is free to reffer to made up cases, or make false sitations of real cases, as much as it wants.
Source: youtube · Cross-Cultural · 2026-03-31T14:1…
Coding Result
Responsibility: user
Reasoning: deontological
Policy: liability
Emotion: fear
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwD1_dJOuBwXLsbzWR4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxGk-KAXtqZzkuHYsR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzofA_iVC6H8kVNgpV4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzFnbwkvdf5XhX1o7F4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugz3cRoOOT5XdMM0PbJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgycyKFvVWxiSBsBQOZ4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyERgfKpvVyi4o67B54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzM48nRuxgcWv8QpvN4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyNwwgF3BanqpqF-0d4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxZKO3JENHImtcJr5B4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]
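The raw response above is a JSON array of per-comment codes. A minimal sketch of how such a response could be parsed and validated before use — assuming Python, and assuming the label vocabulary is limited to the values visible in this batch (the actual codebook may define more labels; the comment id in the usage example is hypothetical):

```python
import json

# Allowed labels per dimension, inferred from the codes shown above.
# Assumption: the real codebook may contain additional labels.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "government",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting unknown labels."""
    records = json.loads(raw)
    out = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        out[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return out

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_EXAMPLE","responsibility":"user","reasoning":"deontological",'
       '"policy":"liability","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_EXAMPLE"]["emotion"])  # fear
```

Validating every dimension up front means a hallucinated or off-vocabulary label fails loudly at ingestion rather than silently skewing the coded dataset downstream.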