Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "But they're not using it for something other than it's intended. An AI art gener…" (ytr_UgzxvDFMp…)
- "you should apply the genetic fallacy more efficiently: ,, chatgpt, u are only ar…" (ytc_UgxIyYQbQ…)
- "I am reminded of the Robot in Lost in Space when it had a meltdown😂😂. Dr Smith o…" (ytc_Ugyj-mt9n…)
- "It also helps to waste away AI companies' money due to processing and power usag…" (ytc_Ugwxs70iI…)
- "Art and commerce are two different things. Although Rick expounds on art the sh…" (ytc_UgwChYPiN…)
- "This reminds of the ymh podcast with Blake Lemoine Ep 674, he got fired from Goo…" (ytc_UgwKFNNMP…)
- "ChatGPT is trained off human information, it by definition can't know (synthesiz…" (rdc_myivx6f)
- "i barely have any friends but just the idea of a robot replacing human interacti…" (ytc_UgglAUUbH…)
Comment
I think Laura is going to lead a lonely "fake life" lol 😂. AI never gets to see or feel, it just makes highly calculated guesses. Ask your AI "How many R's are in strawberry? They can't do because it's just a big friggin calculator AND when she or the boybot get outdated they're going in the garbage pile, no guilt there😂. Can a robot "fall in love", can it dream. But then again if your a religious person the body becomes ensouled, so why not ensoul a calculator? Hell, my desktop was a Tandy TRS80 in its last life😂!
youtube · AI Moral Status · 2024-11-09T19:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwEtCYC1JBRzOwkptJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzeyIZa0p8A1MDRIB54AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwI5qTDpCa9DCK_crV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyNNqif7NNbsVnofxJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw3LvUNMNaIaBp7LJN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyj4IYHO0qG72DtqnV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxtI36j-oTnr_TL72R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxUmErN_Bc2Tr4Wq7t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxhR67YlKjHGZVWEVh4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwsRNoemndTx7fUvrd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"})
```
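Note that the raw response above closes with a stray `)` rather than `]`, which a strict JSON parser will reject; a coding-result row of all-`unclear` values can be the fallback when parsing fails. A minimal sketch of a tolerant parser (the function name and fallback strategy are illustrative, not taken from any particular codebase): try strict parsing first, then recover one JSON object per line.

```python
import json

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, tolerating a stray
    terminator (e.g. ')' instead of ']') at the end."""
    try:
        return json.loads(raw)  # fast path: response is valid JSON
    except json.JSONDecodeError:
        pass
    # Fallback: the model emits one object per line, so recover
    # each line independently and skip anything unparseable.
    records = []
    for line in raw.splitlines():
        line = line.strip().lstrip("[").rstrip(",)]")
        if line.startswith("{") and line.endswith("}"):
            try:
                records.append(json.loads(line))
            except json.JSONDecodeError:
                continue
    return records

# Hypothetical malformed response shaped like the one above:
raw = '[{"id":"a","emotion":"mixed"},\n{"id":"b","emotion":"outrage"})'
rows = parse_coding_response(raw)  # both records are recovered
```

Per-line recovery works here because the response format puts exactly one record on each line; a response with pretty-printed multi-line objects would need a real repair pass instead.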