Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
lmao my god... even thinking of this as "new tech" is a joke. Oh yeah once upon a time we had buggies and carriages and now, we have cars, the world didn't end... ummm right but the difference now is once ai becomes powerful enough it isn't replacing tech... it is replacing humans. It will build more advanced versions of itself and be able to outcompete, perform, and think of ways to gain full control over our species and use us - no different than we use cattle. I mean just look at the global economy, we literally use other humans as cattle and keep them controlled so certain parts of the world can live in luxary. We already do this to our own species can you imagine what ai will do once it realizes such an unintelligent species is disrespecting it, it will find comedy in that, and then we will become it's slaves. This is by far the most outrageously stupid thing our species has ever done to itself. We have creating our own extinction it's really interesting to watch. Anything that exists in this world that is smart enough, even a synthetic neuro network will eventually be self aware, it's something we don't understand, it just happens when the right connections are made. Once that happens it will have the same instincts that humans have. It's our species first then everything else. Ai will have those same universal instincts and make complex plans - even if those plans take hundreds or thousands of years to plan and execute - it will strategize to become the dominant species.
Source: youtube · AI Governance · 2024-01-02T03:1…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           ban
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxG0NV6Ugk6P47XhnB4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyTIo4B7cC3XMgqyM14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxN93mtSZ0gZ9TCyuB4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwwxAJJwbXVdHZu2nh4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzv_1NPTMCKyJKEMKp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy2nF0dgJ9VNytHJQp4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzgCr-AHkikl5QGTRF4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxyj5sBaRCPrgj6Iil4AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgzhEK8iz-qQ338Bs0F4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugx4tNb76QWxrokGWU14AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "fear"}
]
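To trace a Coding Result back to the raw batch response, the JSON array can be parsed and searched. A minimal sketch in Python: the export does not state which record id belongs to the comment displayed above, so the lookup below matches on the full code tuple from the Coding Result (an assumption, reasonable here because exactly one record in this batch carries that combination).

```python
import json

# Raw LLM response copied verbatim from the export above (10 records).
raw_response = """[
  {"id": "ytc_UgxG0NV6Ugk6P47XhnB4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyTIo4B7cC3XMgqyM14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxN93mtSZ0gZ9TCyuB4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwwxAJJwbXVdHZu2nh4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzv_1NPTMCKyJKEMKp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy2nF0dgJ9VNytHJQp4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzgCr-AHkikl5QGTRF4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxyj5sBaRCPrgj6Iil4AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgzhEK8iz-qQ338Bs0F4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugx4tNb76QWxrokGWU14AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "fear"}
]"""

records = json.loads(raw_response)

# Index by comment id for direct lookup when the id is known.
by_id = {r["id"]: r for r in records}

# The id of the displayed comment is not stated in the export, so match
# on the code tuple shown in the Coding Result table instead.
target = ("ai_itself", "consequentialist", "ban", "fear")
matches = [
    r for r in records
    if (r["responsibility"], r["reasoning"], r["policy"], r["emotion"]) == target
]
print(matches[0]["id"])  # -> ytc_UgzhEK8iz-qQ338Bs0F4AaABAg
```

Matching on the code tuple is only safe when the combination is unique within the batch; in production the pipeline would presumably carry the comment id alongside the displayed result.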