Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Ironic, the guy who engineered AI to be sharp and mimic the human brain, is now backing away. How is it us ordinary people have always been wary of robots, computers and automation, and could say “told you so” & “they’ll take over” and yet the inventor didn’t give this much 💭 . He’s either naïve, stupid or greedy (greedy- took it as far as his conscious would allow, knowing he’ll probably be dead by the time AI has started to realise it’s superior to humans). It’s like trusting a burglar who says they’re going straight, to look after your house whilst you’re away, you’d be checking your cctv regularly. The reason why we don’t have a nuclear war is most intelligent and civilised people know once the buttons pushed that’s it….bye bye to most, if not all of humanity. Get the nuclear bombs in the hands of a despot and that’s it, boom, especially if people think heavens better than Earth, and you’ll be forgiven. Good luck as no one knows and who says the red fella won’t be waiting with his pitch fork before you reach the gates or the arms of your maker. I just wish people realised what they have on this earth, how wonderful this place is and why we don’t invest in keeping the planet healthy, instead of ruing in it. And stop being selfish and thinking about how much money we can grab, leaving a cr@ppy environment for our off spring. There’s 4 billions years left on the clock, so why are we hell bent on bringing our own extinction forward several million years.
youtube · AI Governance · 2025-06-16T13:2…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        virtue
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:24:53.388235
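
The coding result above is a flat record with four categorical dimensions plus a timestamp. The Python sketch below is one way to represent that shape; the CodedComment class name is illustrative, and the label lists in the comments are only the values visible on this page, not the complete codebook.

from dataclasses import dataclass
from datetime import datetime

# Assumed shape of one coding result, mirroring the table above plus the
# comment id carried in the raw response. The example labels are only the
# values observed on this page, not a complete codebook.
@dataclass
class CodedComment:
    comment_id: str       # e.g. "ytc_Ugyu10YOdsAIE13mLOB4AaABAg"
    responsibility: str   # observed: developer, company, government, user, ai_itself
    reasoning: str        # observed: virtue, consequentialist, deontological, mixed, unclear
    policy: str           # observed: liability, regulate, ban, none, unclear
    emotion: str          # observed: outrage, fear, approval, resignation, indifference, mixed
    coded_at: datetime    # e.g. 2026-04-27T06:24:53.388235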
Raw LLM Response
[ {"id":"ytc_UgyK0sv9MJ1DPHTwAMJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgwoUkBfg2avkBnSuAh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_Ugx-4Ux_3gKJqMawH3F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"}, {"id":"ytc_UgwFMxwljTBqgzn-aGN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxqDl-9zcxnR_YZNpR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgwN-42pndeGHyw94kx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"}, {"id":"ytc_Ugyu10YOdsAIE13mLOB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}, {"id":"ytc_Ugwu5pJrrOZCPzseiSd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgylxJm206cugGjSxnx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgyFOqx2tl6bZIlPeQ14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"} ]