Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Legislation that prevents the USA from biological experiments exists, they simply do it in countries that don't have the same legislation instead. Even if individual states legislate against A.I. and machine learning in certain fields, nothing would stop federal level usage if these technologies created too much of an economic imbalance in, say, China. Example: China uses A.I. to create a mega virus targeting vital U.S. infrastructure. Humans can't compete with the processing speed of A.I. leaving the U.S. vulnerable. The only method to counter the mega virus is to create an A.I. to fight it. Pandoras box is opened.

I can't remember the name of the book, it was made in to an inferior television series, starring Josh Hartnet. In this book an A.I. is created to play on the financial markets. It creates so much wealth and interferes with its creators life so much so that it becomes dangerous. The creator attempts to switch the A.I. off, however the A.I. had foreseen this possibility so it started covertly redirecting small amounts of its generated funds (in comparison to the wealth it generated) to create its own server farm and infrastructure at a secret location. It uploaded itself to that server farm and buried itself so thoroughly in the world wide web that there was no way to remove it without total collapse of all connected infrastructure.

There are many examples in science fiction of what could go wrong with A.I. and none of them fully realise the possible dangers of a true A.I. that is fully connected to the modern infrastructure we use today. Skynet, Ultron, Ava from Ex Machina, Sonny from iRobot.
youtube AI Governance 2024-02-29T20:0…
Coding Result
Dimension        Value
Responsibility   government
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgxB574rBySRm4oTMQF4AaABAg.9zfgKG2RkJ2A0Xo5qVUnZe", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxB574rBySRm4oTMQF4AaABAg.9zfgKG2RkJ2A0Xsn4t8Asc", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyBKWBhr2qTMpLG9SB4AaABAg.9zeFzPEiHIeA0PwDKoJadN", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugx2mYOj26vPVHW4fMh4AaABAg.9zd4kg7Mz2u9zzPAnRBAcF", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyEOVXYwXrVt5GaOtB4AaABAg.9zcxSIoKIcfA9Lum0IMUTK", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgzA79vbzrOJDzYIH2d4AaABAg.9zbpryFxWUsA-JixUqZ0GS", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_Ugw4RZK-W3wfUA2VZSJ4AaABAg.9zbdk8QeDRQA-kzrHHcdkO", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwU8wjYrsaJ_1xotXd4AaABAg.9zbYUZP1OzpA0XpX4sJ7x4", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugx9HE811khRGDUbCTp4AaABAg.9zbOhZzwGs29zrlrIavz98", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgzoiqacwDnTsirvXtl4AaABAg.9zb5LdYWvsR9zm38Fa02mw", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
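A raw response like the one above is a JSON array of flat records, one per comment, keyed by comment id. The sketch below shows one way such a batch might be parsed and indexed by id, with a minimal check that every record carries the four coding dimensions. This is an illustrative example, not the pipeline's actual code; `parse_coded_batch` and the short sample ids are hypothetical.

```python
import json

# Sample batch in the same shape as the raw response above.
# The ids here ("ytr_abc", "ytr_def") are made up for illustration.
raw_response = """[
  {"id": "ytr_abc", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_def", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]"""

# Every record must carry the comment id plus the four coding dimensions.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coded_batch(text):
    """Parse the LLM's JSON array and index the records by comment id."""
    records = json.loads(text)
    coded = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
        coded[rec["id"]] = rec
    return coded

coded = parse_coded_batch(raw_response)
print(coded["ytr_abc"]["policy"])  # regulate
```

Indexing by id makes it straightforward to join each coded record back to the comment it annotates, which is exactly the lookup this inspection page performs for the comment shown above.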