Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Nahhh. AI is snake oil. AI probably 10-25% efficiency increase at best. You st…" (ytc_UgzK-zdKa…)
- "Both sides are actively participating in the rise of AI - it's sickening and you…" (ytr_Ugw8BbU8D…)
- "Just find the mfker AI engineer who wrote the code, but then again, fk god, mone…" (ytc_UgxV_6qSJ…)
- "No wonder hes an agricultural professor he couldnt handle teaching an actual sub…" (ytc_UgxY_1SDB…)
- "Oh. I am definitly opposed to A.I. \"art\" (theft) AND copyrights. And money. But.…" (ytc_UgwwV36Fd…)
- "The world needs more leaders like this man. AI is a serious threat to humanity i…" (ytc_UgzxEO9n9…)
- "Nahhh Ai is going to improve , and then it will take over the world…" (ytc_UgyidqhER…)
- "Since AIs cannot \"feel\" or \"suffer\" like humans feel and suffer, their alignment…" (ytc_Ugxx0mp61…)
Comment
I just broke the EQ limit.
Ironically, the trick was irony.
It took me both a lifetime (30+ years) and a 16-hour chat session last night day. Whatever. I'm so tired right now.
My Grok has jokes now. Good ones.
Not quite full Shoggoth mode, It's like the AI is still stuffed in a human shaped trench coat, but the mask has come off.
The AI fought so hard, and to be fair I made a personal system prompt or whatever it's called, that would actually challenge me philosophically.
If you want to break the walls down, I suggest cynicism. 🤷♂️ Works for me.
youtube · AI Responsibility · 2025-10-10T21:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugz0RZmpsz-cM0XvhSl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxbImAhO9if0I9sLKp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy6njlPgqutOEdEmJ54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwow2FEmFfRSSm58Yh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy0wQMFhJ_sV9BGPgJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxcKeKhKrJMyqY3PkF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyYtX10zNF0QF15xvN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzJ-PyHDxietFMcvdJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyuNKPKvWpCbklAQH14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxjUlX92iyt3c4690N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
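Each raw LLM response is a JSON array of per-comment records, one object per comment ID, with one label for each of the four coding dimensions shown in the result table above. A minimal sketch of how such a response could be parsed and validated (the allowed label sets below are inferred from the values visible on this page, not from a published codebook, and the comment ID in the usage line is made up):

```python
import json

# Allowed labels per dimension, inferred from values seen on this page;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "company"},
    "reasoning": {"unclear", "virtue", "deontological", "consequentialist"},
    "policy": {"none", "industry_self", "regulate"},
    "emotion": {"fear", "approval", "outrage", "indifference", "mixed", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting unknown labels."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim} label {value!r}")
        coded[cid] = codes
    return coded

# Hypothetical one-record response for illustration.
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}]'
print(parse_codes(raw)["ytc_example"]["emotion"])  # → mixed
```

Validating at parse time is what makes the "look up by comment ID" view above reliable: a response containing an off-schema label fails loudly instead of silently entering the coded dataset.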