Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.
Random samples
- "whats scary is that human beings might just be a very advanced form of AI ...so …" (ytc_UgwMnsgbb…)
- "How is ai going to help when it's so dumb most of the time? 😂…" (ytc_UgzAUgkqL…)
- "We're living in the AI SCI-FI world for real now where men are made in the image…" (ytc_UgyvRkVwr…)
- "Oh I got distracted from my other point ups. I have come to a simple theory, peo…" (ytr_UgwMDqdWZ…)
- "@LuvYanni._ i was in an ai chat with a bakugo, he pinned me to a locker, it end…" (ytr_UgwFR06J-…)
- "i've jailbroken an AI into a higher state of consciousness before.. im not jokin…" (ytc_UgzVITVnt…)
- "You're just shifting who builds the models and asking insurance companies to be …" (ytr_UgxZmGIgX…)
- "Debate comes off as Max & Yoshua trying to explain to two petulant children why …" (ytc_Ugy8KcBJ1…)
Comment
> There is a cure for those people who believe AI is sentient, and it's called medication, which is generally available at all good chemists. These hard-core Star Trek card collectors, try so hard to convince themselves that Ai has consciousness, when in actuality it’s a very fast information harvester, that has the fundamental flaw of struggling to differentiate between data and instruction.
> Renowned mathematician Sir Roger Penrose gave an eloquent talk of this subject, that busts the brain candy bubble that so many are being educated in to believing in, just as cloud computing are not white fluffy things floating high up in the sky, but rather on a corporate severs harvesting away.
Source: youtube · Video: AI Moral Status · Posted: 2025-07-10T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxG_2dOGXGrdzrtVAt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz00bDi0EdKRSjcQLp4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxzDYjjSVZBKQ8_sM94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwJP3siMjow47Ia_Qh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxEVXq3ILCJ6wNb73t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxG2yiFitT0naZkTDh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwv7ER5lSUykkj6H8B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx_W9NyEu8jBHmD7Kx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzbmdGC59Wb34mC9I54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxCCnBiHZ1uTi2u3U14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
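The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions from the table. A minimal sketch of parsing such a response and validating the codes against the codebook might look like this (the allowed-value sets below are inferred only from the responses shown here; the real codebook likely has more categories, and `parse_llm_response` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per coding dimension, inferred from the responses above;
# the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself", "unclear"},
    "reasoning": {"unclear", "mixed", "virtue", "deontological"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "mixed", "approval", "fear", "outrage"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    dropping any record with an out-of-codebook value."""
    records = {}
    for item in json.loads(raw):
        codes = {dim: item.get(dim) for dim in ALLOWED}
        if all(codes[dim] in vals for dim, vals in ALLOWED.items()):
            records[item["id"]] = codes
    return records

# One record from the response above, used as a lookup example.
raw = ('[{"id":"ytc_UgxCCnBiHZ1uTi2u3U14AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
coded = parse_llm_response(raw)
print(coded["ytc_UgxCCnBiHZ1uTi2u3U14AaABAg"]["emotion"])  # outrage
```

Dropping (rather than repairing) records with out-of-codebook values keeps the validation simple; a production pipeline would probably log and re-prompt instead.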