Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
He is so talking out of his ass, to make you hear what you wanna hear. Sam is one of the leading companies who are pursuing super intelligence at extremely reckless rates. When he says we need a world treaty to agree on the fact that no one can develop super intelligence anytime soon. We can only focus on narrow AI at the moment then I will start believing he has good intentions and isn’t just putting on a face to please, the people who are educated enough to know the real wrists of what’s going on right now. And it is not hard to educate yourself on this, just look up some podcast interviews on YouTube from the leading world researchers in AI development that have quit the development of AI and started to pursue safety awareness for AI instead of working on it with everyone else because they know we do not have a solution to alignment over anyway to shut this down. If it gets super intelligent, the brightest minds that have worked on this I’ve literally said it is impossible at this moment.
youtube AI Governance 2025-11-25T13:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       deontological
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgyoazZ8wh7_4rHbrf54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx8G9uP71XJPanjMJt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzHD_1Velcyb0KWLT54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyULRjvaIUvVges9Bx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwhryNMeLw6vjIzUgZ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwRGPqIvjmObHYrJ1t4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxPM73qiSV9QIf0ACB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgycYD5hf9vwiUcjPs94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzaJsL47Sh-v-Zf1zV4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy0M0L6wNxea6wAfbB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
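A raw batch response like the one above can be parsed and validated before its rows are loaded into the coding-result table. The sketch below is a minimal example, assuming the allowed vocabulary for each dimension is exactly the set of codes observed in this response (it is inferred, not a confirmed codebook):

```python
import json

# Allowed codes per dimension -- an assumption inferred from the values
# observed in the raw response above, not a confirmed codebook.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding},
    rejecting any code outside the allowed vocabulary."""
    codings = {}
    for row in json.loads(raw):
        cid = row["id"]
        coding = {dim: row[dim] for dim in ALLOWED}
        for dim, value in coding.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim} code {value!r}")
        codings[cid] = coding
    return codings

raw = ('[{"id":"ytc_UgyoazZ8wh7_4rHbrf54AaABAg",'
       '"responsibility":"company","reasoning":"deontological",'
       '"policy":"regulate","emotion":"outrage"}]')
print(parse_codings(raw)["ytc_UgyoazZ8wh7_4rHbrf54AaABAg"]["policy"])  # regulate
```

Validating against a closed vocabulary at parse time catches the common failure mode of an LLM coder inventing an off-codebook label, so bad rows fail loudly instead of silently entering the results table.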