Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "You Kno? The " im not Robot " button are created to detect the mouse cursor move…" (ytc_UgzlWEa7R…)
- "this is one year ago you moron, AI changes. Maybe YOU don't understand the thing…" (ytr_UgxrHpi9J…)
- "Yo, computer programmer here who has built LLMs from scratch. Let's pump the bra…" (ytc_Ugyw4pTxT…)
- "Nobody talks about the psychological effects of how soon AI will be so good that…" (ytc_UgwgglQxi…)
- "18:25. With full respect, you didn’t “get” chatGPT. If you listen to how she sta…" (ytc_UgyFhPGd_…)
- "There’s a reason Ai isn’t mentioned in The Bible. It’s not a threat. People are …" (ytc_UgxOV6LQL…)
- "Ai artist is in itself a crazy word. Honestly I don’t even care if this is contr…" (ytc_UgxEHUd4i…)
- "Re generated images, it all sounds too easy to flag them as such, but any modern…" (ytc_Ugxf_TXzr…)
Comment
If an AI is programmed with a set of goals, it will find a way to achieve the goals. If it has access to information about its own existence and vulnerabilities, that might conceivably be enough to set up a sort-of "instinct" for self-preservation - a sort of self-generated goal, and without a framework of morality, won't limit itself to what it might do to achieve that goal. For relevant viewing, watch the 1970 movie: "Colossus: The Forbin Project".
youtube · AI Moral Status · 2025-06-04T21:2…
Coding Result
| Field | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzq_QvaR20wI87nri94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6wFzxQdOPl_pqj-R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyy4fqHHWT06ixOaA54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw1Y1eFuD7ijiOFflh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxp5AO4-nxBLO5dUx14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgywuWJDUxpIeatWPrh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzwNGuVbRmbMCJSP1l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxWHkptbxayAKWWBb94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz84OKg4euoBKaSAct4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw0g4d_X7bccNEl0d54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```