Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
so here's a fallacy: lots of people *had* to die before we made cars/planes/etc. safer -- thus, releasing dangerous products for public consumption and fixing 'bugs' as in response to deaths is a good strategy for this new dangerous product;; holy fuck, stop socializing the costs (unless you're going to socialize the profits) -- we don't need AI until it is unlikely to ruin our lives/kill people/cause death;; just gtafohffs
youtube AI Jobs 2025-11-19T19:0…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugz7AqhqAZNCC4_v4KN4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwUZ67etWZHdUa1q1x4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwxk_7TUQkXU-17M3t4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwckXWnQogs_rJuza14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwYrylvqNDYRb4ZhUl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwxyaERSSVoRvngihx4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugwyk_U6H55zAPACFZl4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy_3rBbf-EVTlP5afx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx6G5QoUNQM-jYEyGR4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyYbaCx_DJYvzdg3lt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
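A minimal sketch of how a batch response like the one above can be turned into a per-comment lookup, so the coding for any single comment id can be inspected. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the JSON shown; the variable names and the truncation to two entries are illustrative.

```python
import json

# Raw LLM response in the format shown above (truncated to two entries for brevity)
raw = '''[
  {"id": "ytc_Ugy_3rBbf-EVTlP5afx4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugz7AqhqAZNCC4_v4KN4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Index the batch by comment id for direct lookup
codings = {row["id"]: row for row in json.loads(raw)}

# Fetch the coding result for one comment
coding = codings["ytc_Ugy_3rBbf-EVTlP5afx4AaABAg"]
print(coding["responsibility"], coding["policy"])  # company liability
```

Indexing by `id` first makes it cheap to cross-check the rendered Coding Result table against the raw model output for any comment in the batch.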