Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Obviously, quite an informative discussion concerning the immediate (and very likely) outcomes of AI implementation, but the revelation of the guest being an apparently firm believer in simulation theory threw me. We simply do not know enough about the nature of reality to assume as much. Is it well within the *realm* of possibility? Of course. We still haven't even gotten close to truly solving the mystery of consciousness however, mind you. We have many interesting/useful observations and measurements concerning how the physical world seems to function, locally, and from our extremely limited perspective, but that is not enough to propose what he was suggesting about simulation. About reality at its core. And to base his belief almost entirely on commonalities between different religions? Of course that suggests *something*, but it's quite the logical leap to then ascribe its significance to simulation theory, and therefore a physical 'Creator' type of figure. I think it's an oversimplification at best; a 'simulation' of 'what'? What would be the reference point to create this 'simulation' from? And of course the eternal question of what then is 'outside' of the simulation? Where'd all of 'that' come from? It leaves too many questions for him to be as confident in his answer and leaves the rest of his opinions feeling slightly dubious. Idk, seemed worth saying. I also don't trust anyone that wants to live forever. Everything preceding us, and currently surrounding us, informs us that dying is perfectly safe and normal.
youtube AI Governance 2025-12-17T05:1…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        consequentialist
Policy           unclear
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_Ugy0E95gn2pxhou0VMJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugw3Y2X8BGz9LcsL4Bl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugyo8nFYrDr0dmdmush4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugw4b5W4cP83efZA8Ld4AaABAg","responsibility":"elites","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugz8kcfLPxcJ2Y6_vZF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugzuv_otTZGbHoXrns54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugy94K9dsJs_Ulseobh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgwzyBmTqKOWk4J0DVN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UgwXvy-0cmyatSFQ1V14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugx1_PCfdXuxECaeztp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"}]
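The raw response above is a JSON array, one record per coded comment, keyed by the comment id and carrying the four coded dimensions. A minimal sketch of how such a batch response could be parsed and looked up by id (the function name `index_by_id` is hypothetical, not part of the actual pipeline; the two records are copied from the response above):

```python
import json

# Abbreviated raw batch response: a JSON array of per-comment coding
# records, each with the comment id and four coded dimensions.
raw_response = """[
  {"id": "ytc_Ugy0E95gn2pxhou0VMJ4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw3Y2X8BGz9LcsL4Bl4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse a batch coding response and index its records by comment id."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)

# Look up the coding for the comment shown on this page.
record = codes["ytc_Ugw3Y2X8BGz9LcsL4Bl4AaABAg"]
```

The id-keyed dictionary makes the "inspect the exact model output for any coded comment" step a single lookup rather than a scan of the array.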