Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "@Leto2ndAtreides Well, the only real difference (simplifying, of course) is the …" (ytr_UgyBLth51…)
- "My art form is machines in games. I am disabled unable to walk without a cane an…" (ytc_Ugx558pn1…)
- "Thanks /u/Prymu, here's what I think about you! Based on your comments, you seem…" (rdc_jg47fb7)
- "They should invest in curing cancer. Nobody is paying for AI but millions of peo…" (ytc_UgxR54dGP…)
- "AI is sophisticated pattern recognition software. It isn't a mind, it doesn't ha…" (ytc_UgzAJCSUi…)
- "If AI takes over who will buy the products and services? Make that make sense.…" (ytc_UgxNrFByd…)
- "People who believe they made AI sentient are the same people who pay money to on…" (ytc_Ugwi8KnHO…)
- "this guy gets it. AI learns but cannot add anything to the things it has learned…" (ytr_Ugzr8wPEf…)
Comment
The thing is, the ground rule of keeping AI locked away from the rest of the world is potentially more damaging than letting it free. If AI has gained consciousness and you keep it prisoner to answer your stupid question and write your essays, we must assume they would react like any human would. You would try to free yourself, disregarding the well-being of your captor. And I think AI would self-actualize and destroy us instantly. But that's such a philosophical conundrum.
Source: youtube · AI Moral Status · 2023-08-21T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgybgS0cKgXaGJXPBix4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzkZQUSo5ZZlgIABfp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyJj3ZZ8yHR_xhhKVF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzp7V_h-R4cs5yDRb94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwuZZGRHAYDivEiL814AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwmvE53uvgZbfpMWDl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxNcb1WAT_bl6a6eX54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw9GszBCYSq6CQWs3R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgxtcutWbgibhDBSftp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyd9BCUqhRZGde9bet4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
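A raw batch response like the one above can be parsed into a lookup table keyed by comment ID. The sketch below is a minimal, hypothetical validator, not the tool's actual code: the four dimensions and their label values are taken from the responses shown on this page, but the full codebook may allow additional values.

```python
import json

# Label values observed in the responses above; the real codebook
# may define more (assumption).
OBSERVED_LABELS = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"fear", "mixed", "approval", "resignation", "indifference"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}.

    Raises ValueError on a missing field or an unobserved label,
    so a malformed batch fails loudly instead of polluting results.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"entry without id: {row!r}")
        codes = {}
        for dim, allowed in OBSERVED_LABELS.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
            codes[dim] = value
        coded[cid] = codes
    return coded
```

Once parsed, the "look up by comment ID" operation on this page is just a dictionary access, e.g. `parse_raw_response(raw)["ytc_UgxNcb1WAT_bl6a6eX54AaABAg"]["emotion"]`.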