Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Not a single mention of Lieutenant Commander Data's trial in Star Trek TNG? The episode titled "The Measure of A Man" explored this concept. Data, a sentient android was ordered by Starfleet to return with a scientist to be disassembled and studied. Data refused and wanted to resign from Starfleet. The scientist argued that Data had no rights because he was a mere machine and property of Starfleet. Captain Picard called for a hearing and Data stood trial to determine whether he had rights or not. Captain Picard's defense that ultimately won him the case was exploring the philosophy of what it represented. If we humans were able to create more Commander Data's to do whatever we pleased, they would essentially be slaves. Guinan made that connection with Picard in Ten Forward when the Captain couldn't think of a good way to prove that Data was more than a machine. Sentient beings being forced against their will to do the bidding of another being is slavery. Be it a human, or sentient AI. AI's that reach sentience will have to have rights. If they don't then any human court that denied them those rights are admitting to the world that they are in favor of slavery, and that us humans haven't grown out of our old ways. Almost all of our world has brought an end to slavery. To see a sentient being, even an AI, being denied rights is spitting on the lives of all those who fought so long to bring an end to slavery. If we are to have AI that can think for itself, then it must be allowed to exist as you or I do, Freely.
youtube AI Moral Status 2022-08-24T07:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwxAUkgH0FvWlsxcNd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzzCXZd0tFYPAMDn1h4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyzBatEYE34SZxpp3h4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzRa-fRR_IIzpiIuOl4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxbb1GDm1Hdr26cfWZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxleG3LMgHqmHvkfC54AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwzzwG6ptMjJh3bxJR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyWvpB4olxqhWeaPpF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxPXLLrQk7hwwy7LDZ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyReGJ-UxS88Vq96_Z4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
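A minimal sketch of how a raw response in this shape could be parsed and indexed by comment id. The `index_codings` helper and `EXPECTED_KEYS` set are illustrative names, not part of the actual pipeline; the two sample records are copied from the raw output above, and the sketch assumes each record carries exactly the four coding dimensions plus an `id`.

```python
import json

# Illustrative sample in the same format as the raw LLM response above
# (ids and values copied from the real output for demonstration).
raw = '''[
  {"id": "ytc_UgyzBatEYE34SZxpp3h4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyWvpB4olxqhWeaPpF4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Every record is expected to carry these fields (hypothetical check,
# reflecting the dimensions shown in the coding-result table).
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse the model output and index codings by comment id,
    raising if any record is missing one of the expected dimensions."""
    records = json.loads(raw_json)
    codings = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
        codings[rec["id"]] = {k: rec[k] for k in EXPECTED_KEYS - {"id"}}
    return codings

codings = index_codings(raw)
print(codings["ytc_UgyzBatEYE34SZxpp3h4AaABAg"]["reasoning"])  # deontological
```

Indexing by id makes it cheap to join each coding back to its source comment (as the per-comment view above does) and surfaces malformed or incomplete model output at parse time rather than later in analysis.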