Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Oh word you can't pre test some things before unleashing them into the field? and yet we are relying on this technology, that can't be pre tested, for medical research, military action, law enforcement, and general social interaction, just rolling with this issue as if it was something acceptable to bear with?

54:30 Eric Schmidt: "it now looks like if you have enough land, the solar and battery backups are more cost effective on a present value basis than natural gas"

That's very convenient and idyllic, if it wasn't because lobbyist are working hard to not be regulated, there's a good reason why the anti datacenter movement is gaining so much momentum, a movement that directly opposes Eric Schmidt's current job, and it is because the energy, land, and water consumption of datacenters and their upscale versions AI factories, just displaces people from their home by lowering real state value, by destroying the environment around them, by depleting the water sources, poisoning the air with their gas ran energy solution, spikes up taxes while diverting funding from public services and infrastructure towards the finance of the datacenters who then receive tax exemptions, it increases the cost of public services like energy and water, and completely violates any laws regarding public noise levels, so it generates sound pollution too, while generating little to no new jobs for the now receding local communities, because the whole thing is designed to be automated beyond the initial construction

And after all that damage is done, people fall ill, impoverished, dead, and have to sell their houses and leave their home Eric has the audacity of saying "oh look at all that newly on sale cheap land free to grab at a discount!", NOW we have enough land to switch into a solar solution, that is more efficient, and will let us claim being an environmentally friendly industry, of course you need enough land first, so the displacing, and killing of the people that lived on that land, is by design, and part of the process, but there he is shamelessly talking about it as if it was an advancement and victory of honest research and technological development

1:00:17 Nate Soares... it is ok to say that Neil asked the wrong guy, that way he could have passed the question to another speaker, but instead you just went on a ramble about AI making AI to go rouge and kill humanity, and how you want people to focus on that, while ignoring the more imminent issue of mass unemployment The problem of mass unemployment and a global financial crisis, clearly doesn't concerns you nor worries you, but instead of telling people to ignore the incoming issues, just say that you are not prepared nor know how to tackle the question, rather than make an irresponsible and ignorant call for everyone else to also ignore current societal issues

1:15:36 Palantir disagrees, not only is the reasoning of their AI good enough for deployment and active decision making, it can even do pre crime

1:32:00 wow Eric, i wonder why Neil would want to avoid featuring a book coauthored by notorious war criminal and repeat human rights violator Henry Kissinger, but thank you for having such little social awareness and bringing forth your allegiances and ties, yes your words and those of Kissinger will dissuade any worries people have about AI making their lives worse in the future, it will dissuade any worries about the USA still having a democracy despite project 2025 which wants to make Trump president for life going unobstructed, and worries about AI becoming part of a regulated industry, despite the fact that lobbyist are pushing towards the complete opposite
Source: YouTube · AI Governance · 2026-03-21T19:5… · ♥ 7
Coding Result
Dimension       Value
--------------  ----------------
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         outrage

Coded at: 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgzfWgaCLFlWhETbLvZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyO0pQUeZQgMiHQDSF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzqxQGcBg5ofpQ32-p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwjtXYuqeEc9U33NgV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzbGTgkQL2qq0nJsi54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwoZ_gPcP6tyyey6wN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxfvc-La-z-JRMIwHt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwIYkAfjFhaX_Jfta54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzWH3EJxG40yVWbjip4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgytSlHacpOoS0CNMsF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
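A raw response like the one above can be sanity-checked by parsing the JSON array and validating each record against the codebook's four dimensions. The sketch below is illustrative: the dimension names come from the Coding Result table, but the sets of allowed labels are assumptions inferred only from the values visible in this response, not the full codebook.

```python
import json
from collections import Counter

# Allowed labels per dimension, inferred from the values seen in this
# response; the actual codebook may define additional labels.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Example using the first record of the response above.
raw = ('[{"id":"ytc_UgzfWgaCLFlWhETbLvZ4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
records = validate_response(raw)
print(Counter(r["emotion"] for r in records))  # Counter({'outrage': 1})
```

Rejecting unknown labels at parse time catches the common failure mode where the model invents a category outside the codebook, so bad records never reach the coded dataset.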