Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Siri in human form. Ask a question and it will give you an answer. I suspect that he ideas limited to information they can search on the Internet or a closed network,. We all know that the Internet is full of crap most of the time….. how do you truly know if the junk is making decisions based on accurate information? The question may be asked “how do you get rid of termites?”… someone jokingly says, “burn the house down”.
After spending millions of dollars and thousands of hours of research, they will find out just how hard it is to be human…. And they still won’t come close. I think this is a real bad idea and to push it only tightens the noose around humanity’s neck. anything that has a computer chip or operates wirelessly can be hacked or controlled. AI is a bad idea for people and humanity.
Computers just can’t contemplate their own existence…. They have no sense of “being”. Nothing else other they know nothing else but only one has been programmed.
Platform: youtube
Topic: AI Moral Status
Posted: 2023-05-01T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyzhdYv7mLbng_9_Cd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzfrk4311Hctp_ZC3V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzrb5IJh-6Cbm6P4iF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyED5mdfM8th7kKCbp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxpb2nyydFai6o7dql4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxir1iNMIk36sChn5t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz52nU4FmG_VBQPIa54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx33PdwFjwvTf99zrR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzQzCtoIbXplITVjj14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzCaNYTn_ciASTaWHp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
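The raw response above is a JSON array with one object per coded comment, each carrying an `id` plus the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch might be parsed and validated before display — the allowed value sets here are inferred only from the values visible in this sample, not from any documented codebook, so treat them as assumptions:

```python
import json

# Allowed values per dimension, inferred from this sample's output
# (hypothetical -- the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"outrage", "indifference", "mixed", "resignation",
                "approval", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check each record's coded values."""
    records = json.loads(raw)
    for rec in records:
        # IDs in this dataset are prefixed "ytc_" (YouTube comment).
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# One record from the response above, used as a smoke test.
raw = ('[{"id":"ytc_UgyzhdYv7mLbng_9_Cd4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
batch = validate_batch(raw)
print(len(batch))  # → 1
```

A record with a value outside the inferred sets (or a malformed id) raises `ValueError`, which is useful for catching model drift across coding runs.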