Raw LLM Responses

Inspect the exact model output returned for each coded comment.

Comment
My problem with the brother almost dying story is that, yes, it is very much probable for a deer to run in front of a motorcycle and cause an accident, the way it was said makes it seem suspicious. It sounds very AI generated and doesn't really have that human emotion when talking about this supposed brother who almost died. Either the narrator repressed his emotions about it that much where it almost sounds like he doesn't care that a close family member almost died, or he's just an AI generated voice who was reading off an AI generated script. I fail to see how that story even relates at all to the topic which is reminiscent of generative AI fishing for sympathy and attention and engagement. If this really is a true story despite my doubts about it, I would hope that this guy's brother (if he even exists) gets better and makes enough of a recovery to continue his life. But again, it was completely out of the blue and it was following up another point that 100% should have been elaborated on with the point of AI utilizing pathogens which is also unrelated to the reason why the brother almost died. The narrator said that he got in a motorcycle accident, nothing to do with bacteria, viruses, nor microorganisms. I feel that this was partially just feeding Geoffrey Hinton's interview and other videos and news articles through a language model and basically summarized the points. I have never watched this channel before so if they actually write their own scripts, I'm sorry.
youtube AI Harm Incident 2025-08-30T00:1…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           none
Emotion          mixed
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgwxEF4eTNpMcAgubv54AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzTxvOpr5u9hGX4uJ54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwv0MrUFMec1d5AQMp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy29oz7TWkIiF3FSl54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxJEwJHun7w0fP5eZR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugz_AHeJ_Tjojm54Ca54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxg6tmN-K-1SoDoA3R4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw-Kn2hpuCbf17NBYh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzNz0FXi8yv8X-vVcl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyusJo3Cf99txA3YiZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
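Because the model codes comments in batches, a given comment's dimensions have to be matched back to its entry by `id`. A minimal sketch of that lookup, assuming the raw response is valid JSON with the field names shown above (the helper name `find_coding` and the truncated sample array are illustrative, not part of the tool):

```python
import json
from typing import Optional

# Illustrative excerpt of a raw batch response; field names mirror
# the JSON structure in the export above.
RAW_RESPONSE = """[
  {"id": "ytc_UgxJEwJHun7w0fP5eZR4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgzNz0FXi8yv8X-vVcl4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""


def find_coding(raw_json: str, comment_id: str) -> Optional[dict]:
    """Return the coded dimensions for comment_id, or None if absent."""
    for entry in json.loads(raw_json):
        if entry.get("id") == comment_id:
            return entry
    return None


coding = find_coding(RAW_RESPONSE, "ytc_UgxJEwJHun7w0fP5eZR4AaABAg")
print(coding["responsibility"])  # -> developer
```

If an `id` is missing from the batch (e.g. the model dropped an entry), the lookup returns `None`, which is worth checking before storing the result.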