Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is the stupidest way to ask this question. The AI hype machine went full blast as "scale" (and therefore massive investment) became the logic in silicon valley. They needed to justify the 2 trillion in investment in data centres - hence, the "be terrified of my super powerful invention" narrative which gripped media reporting on "AI" in recent years. But even they are now admitting that LLMs and other current AI models will never be "super intelligences" (AGI or ASI). At least not in the way that the question being debated here presumes. Nevertheless, "public intellectuals" like Harari, Zizek and Fry continue to flog their opinions all over the internet and at any event that will host them. They are discussing something they seem to fundamentally misunderstand and hypothesising about a future that is incredibly unlikely.
youtube AI Governance 2025-07-21T06:2…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           none
Emotion          outrage
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgwFZwLtT1p2eGKtEON4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzmg3Eb2I3PZAeON394AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy3Xe-Zvhu2OJoXHex4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy7YAZ2pX0O2Suh6mt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxhvcsItIdMOxRO-Jt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz3GdOhDzXwHZQ5TSp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwTM7EwXsmg0AjdfYN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxQ_gKuIf-KUriojwF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzVOv8x8DbaRfHV6iZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwUFP2e04fE-zGe_x54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
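A minimal sketch of how a raw response like the one above could be parsed and sanity-checked before the dimensions are stored. The schema (an array of objects, each carrying `id`, `responsibility`, `reasoning`, `policy`, and `emotion`) is inferred from the payload shown here; the key set and the validation step are assumptions, not the pipeline's actual code.

```python
import json

# Assumed coding schema, inferred from the raw response above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

# Abbreviated raw LLM response (first record from the payload above).
raw = '''[
  {"id": "ytc_UgwFZwLtT1p2eGKtEON4AaABAg",
   "responsibility": "company",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "outrage"}
]'''

records = json.loads(raw)

# Reject any record missing one of the coding dimensions.
for rec in records:
    missing = REQUIRED_KEYS - rec.keys()
    if missing:
        raise ValueError(f"{rec.get('id', '?')}: missing {sorted(missing)}")

# Index by comment id for lookup, mirroring the per-comment view above.
by_id = {rec["id"]: rec for rec in records}
print(by_id["ytc_UgwFZwLtT1p2eGKtEON4AaABAg"]["emotion"])  # outrage
```

In practice an LLM may return malformed JSON or drop a key, so wrapping `json.loads` in a try/except and logging rejected records would be the natural next step.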