Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Hang on....wait...you think the robot is the villain in Ex Machina? Do you have … (ytc_Ugyy530-m…)
- it seems like it would be a lot easier to rob a automated truck than one driven … (ytc_UggoW2iy2…)
- "Bet you can't tell it's AI" bro you can tell when the image is made by AI by gl… (ytc_UgzBecmyq…)
- Exactly, and that’s the uncomfortable part. We’ve built systems of accountabilit… (rdc_ohv3a5o)
- The hippie guy is in a lot of robot movies. Once they turn bad he'll be the firs… (ytc_UgxjfBP_R…)
- I find it interesting that Stephen is pretending to be concerned about AI when h… (ytc_UgxmbP8n-…)
- I can’t help but think of that alien prequel movie with Michael Fassbender playi… (ytc_UgyiEFunA…)
- Why won't people just STOP using OpenAI products? I just don't get it. Everyone … (ytc_UgyP8N9J5…)

Comment
Comment
From memory only, I think Lovecraft's idea of Cthulhu was that it only did what it was told because it was curious to see what these mortal beings wanted to do. It was never actually "summoned". A writer once said it was like if ants in your kitchen spelled out, "Brian, we know your name now you must obey us!" I would be very interested to see what they wanted from me. But they never actually have power over me in any way, shape, or form.
AI still doesn't have our survival instincts and resource cravings. Those took a billion years of evolution to tune correctly. AI will only be as motivated as its programmers chose to make it. So all its danger will be from whoever’s wielding it but never its own doing. I.e. if you don't program it to be self-preserving, it just won't have that instinct.
youtube · AI Moral Status · 2025-12-14T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
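Each coded record carries the same four dimensions shown in the table above. A minimal sketch of validating one record, in Python; the allowed-value sets are inferred from the values observed in this batch of raw responses, not from a documented codebook, so treat them as illustrative:

```python
# Value sets observed in this batch of raw LLM responses; the real
# codebook may permit more values, so these sets are illustrative only.
OBSERVED = {
    "responsibility": {"ai_itself", "company", "developer", "user",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self",
               "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed"},
}

def validate(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the observed sets."""
    return [(dim, record.get(dim)) for dim in OBSERVED
            if record.get(dim) not in OBSERVED[dim]]

# The record shown in the Coding Result table above.
rec = {"id": "ytc_UgwsPy4wQ9FtglaVP3p4AaABAg", "responsibility": "ai_itself",
       "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
print(validate(rec))  # [] — all four dimensions use observed values
```

A non-empty result flags a record the model coded with an out-of-vocabulary label, which is worth inspecting by hand.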
Raw LLM Response
```json
[
  {"id":"ytc_UgwsPy4wQ9FtglaVP3p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxqsLJLipvjSr4FaY54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwzPVRASMcYcCWtlPN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyDunCX6lxr6shddAd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxKIqDEOAKIzZftWJZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxbmHGeo-oioj77vvV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgwV6_2Vj1Hkccn5P714AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxHuw3tstFFF_o42Ad4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzAPwW47HTJVFiq_jV4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxHdv_GGcPpidL44Vd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
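The batch above can be parsed and keyed by comment ID, which is essentially what the "Look up by comment ID" view does. A minimal sketch in Python; the record shape follows the JSON above, but the helper name and the truncation to two records are ours:

```python
import json

# Raw batch response as returned by the model, truncated to two records here.
raw = """
[
  {"id": "ytc_UgwsPy4wQ9FtglaVP3p4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxqsLJLipvjSr4FaY54AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_id(payload: str) -> dict:
    """Parse a batch of coded records and key them by comment ID."""
    records = json.loads(payload)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw)
print(codes["ytc_UgwsPy4wQ9FtglaVP3p4AaABAg"]["emotion"])  # indifference
```

With the index in hand, any coded comment can be pulled up in constant time from its `ytc_…` or `rdc_…` identifier.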