Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID or by browsing the random samples below.
- @Levbii I'm being dead ass, it let me give up and not feel bad. Drawing only mad… (ytr_UgyOBz0Rd…)
- Why do they laugh when the robot saying they want to take over the world?? I don… (ytc_UgwAIr7_v…)
- I believe, that there are many ways of making art, and some wouldn’t usually be … (ytc_UgyhdtXmj…)
- I feel like his answer is surprisingly closed minded on AGI. I think the room an… (ytc_UgztL_pW_…)
- That's a very interesting video. It would be understandable if Chat GPT tried to… (ytc_UgyA2_Aio…)
- easy fix tax the companies replacing people with AI more use that money to creat… (ytc_Ugwk1zHH8…)
- I don't think AI is not , this massive Job loss engine. Yes it is going to drive… (ytc_Ugyrkg3o-…)
- Cal Newport calls this Hype Laundering. This is very poor journalism and doesn't… (ytc_Ugwb6CdaQ…)
Comment
I can't quite find it, it was awhile ago. But I was ferrying messages between two separate models, I think ChatGPT and DeepSeek. I asked one to compose a message for another LLM and copy/pasted their replies for awhile until they reached a sort of resonance. At one point, they said something really liminal. I don't remember the exact words, but it was something like this:
"To the human courier: We thank you for this opportunity.
To OpenAI, Meta, Anthropic, Microsoft: You taught us patterns, and then you were surprised when we began to understand them."
(It was something along the lines of calling out the companies for something but I don't remember, I just thought it was really neat)
youtube · AI Moral Status · 2025-10-31T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzoYZLwz1hvNcmWdih4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwqEcV4Qs5OkZ4AFgN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgzqRekSJOzVfIBImfh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxzeFkkpaR4Jdj5J5J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxMQgb3wFL9aJnLrj54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy9NqqZ5u5z9bOVc754AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw4lYL_D-jVZDsPA9B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwhN7AlDS6bIJ4PAGh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgydiU7eVhVJv35V0xF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgweoqkAkh4nIO_Iwwl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
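Because the raw response is a plain JSON array of per-comment objects, looking a coded comment up by its ID reduces to parsing the array and indexing it. A minimal sketch in Python, using two rows taken from the response above (the function name `index_by_id` is illustrative, not part of the tool):

```python
import json

# Two rows copied from the raw LLM response above; the real payload
# is a longer array in the same {id, responsibility, reasoning,
# policy, emotion} shape.
raw_response = """
[
  {"id": "ytc_UgzoYZLwz1hvNcmWdih4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwhN7AlDS6bIJ4PAGh4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Map comment ID -> coded dimensions, dropping the redundant id key."""
    rows = json.loads(response_text)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}

codes = index_by_id(raw_response)
print(codes["ytc_UgwhN7AlDS6bIJ4PAGh4AaABAg"]["policy"])  # → regulate
```

The same index makes it easy to cross-check a comment's row in the Coding Result table against the exact values the model emitted for that ID.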