Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Ideally yes. Problem is ppl will keep forming these "relationships" with these p…
ytr_UgzidRXzz…
It sounded so much like AI, pretty stilted with very little warmth. I was not as…
ytc_UgyORG6aC…
Robot will take Jobs in future. Solution ???? …
ytc_Ugz2CAG3Y…
Then what ? You can't sell that ai work even for 1¢ a piece. And them "curator" …
ytc_Ugw0rJBuY…
This will be the exact situation that unfolds when AI becomes sentient.. however…
ytc_UgwV82Qb4…
suddenly the game battlefield become reality, suddenly an cyber hacking attack, …
ytc_UgwGpNdDM…
How did you find out that it was AI🙁? Or did you just notice later?…
ytr_Ugzd5Uir8…
I would love this but many families can’t afford it either. Even if there is a l…
ytc_Ugw25Ke9V…
Comment
what kind of question is that, that defeats the entire point of a machine, an automaton that doesnt feel, an algorithm that makes cold calculative decisions, if a robot gained sentience, let me tell you right now it wouldnt be a robot, robots calculate, things with sentience think and learn and use, if you had a sentient machine that is supposed to do math problems, it would be an awful machine that nobody would buy, imagine you ask whats 17 + 4032 and the machine says "your family doesnt love you", if a machine was sentient it wouldnt even NEED to complete its task, thats not its goal, because it doesnt do calculations or algorithms to reach its goal, things with sentience dont HAVE a goal, therefor a sentient toaster would be a toaster that refuses to toast, or a sentient blender would start learning rocket science from internet forums and not actually completing its designed goal
youtube
AI Moral Status
2022-08-09T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwxAUkgH0FvWlsxcNd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzzCXZd0tFYPAMDn1h4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyzBatEYE34SZxpp3h4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRa-fRR_IIzpiIuOl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxbb1GDm1Hdr26cfWZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxleG3LMgHqmHvkfC54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwzzwG6ptMjJh3bxJR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyWvpB4olxqhWeaPpF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxPXLLrQk7hwwy7LDZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyReGJ-UxS88Vq96_Z4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
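The lookup-by-comment-ID view above amounts to parsing the raw model response (a JSON array of per-comment codes) and indexing it by `id`. A minimal sketch, using field names and two IDs copied from the response above; the surrounding code is illustrative, not the tool's actual implementation:

```python
import json

# Raw LLM response: a JSON array, one object per coded comment.
# Field names (id, responsibility, reasoning, policy, emotion) match the
# response format shown above; the two rows are copied from it.
raw_response = """
[
  {"id": "ytc_UgwxAUkgH0FvWlsxcNd4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyWvpB4olxqhWeaPpF4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

codes = json.loads(raw_response)

# Index the coded rows by comment ID for constant-time lookup.
by_id = {row["id"]: row for row in codes}

row = by_id["ytc_UgyWvpB4olxqhWeaPpF4AaABAg"]
print(row["policy"])   # regulate
print(row["emotion"])  # fear
```

The same dictionary also supports filtering, e.g. `[r for r in codes if r["policy"] == "regulate"]` to pull every comment coded with a given policy stance.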