Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples
- "Man why tf doctors tryna use an AI for who needs treatment more? Bitch didn’t yo…" (ytc_UgxoKF0HC…)
- "The problem is that this can and will be used as a weapon. The new arms race is …" (rdc_m27tlq8)
- "Soon, none of us will have jobs because AI robots will take them all, and they w…" (ytc_UgwBp543e…)
- "Even told my chatbot, \"Be glad you're not human, we possess a evolutionary curse…" (ytc_UgzArGtF6…)
- "You know the dude made that robot is banging it too we all know that especially …" (ytc_UgyZmZi-f…)
- "Does anyone seriously think that AI will not significantly change in the next 50…" (ytc_UgwqsrOn8…)
- "Compnies probably using small models like llama-8B for cheaper inference. If the…" (ytc_Ugy-WNSoK…)
- "No dude why don't you tell the truth the most important thing for autonomous tru…" (ytc_Ugwc1U7A1…)
Comment
This guy understands the training routine better than anyone I've yet heard describe it, but he still has no clue how these things work.
Why do all humans always anthropomorphize everything they don't understand.
ChatGPT does not have experiences. It does not know anything. It is not sentient. Nowhere in this entire video is there a hint of a clue that it is. Just the phrase over and over "we don't know how it works"
Ok, that doesn't make it human. Stop it. It's a very complicated text generation program. Computer programs ARE NOT PEOPLE.
youtube
AI Moral Status
2025-10-30T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwmcqBev0edjnm6cNd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzCQ-iUBGiJbotDjLl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzcX1OUtFKy5HoPyrl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwgbJcYsdnmRb3gSkV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzIboIypFQV5iJnjel4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxYVeffix9c9QdsB7F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzqPmTgty0NkHhcLbx4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJxsH41oc6cTXADVR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwUZ0_TvBsN8LCOJ8N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzdAl0JGAMg-NLVBHJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
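The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such a batch could be parsed and indexed by comment ID, with loud failure on malformed records. The `OBSERVED_VALUES` sets below are inferred from the codes visible on this page, not from the actual codebook, so the real category lists may be larger (assumption):

```python
import json

# Allowed category values inferred from this dashboard's samples;
# the real codebook may define additional categories (assumption).
OBSERVED_VALUES = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"indifference", "fear", "resignation", "approval", "outrage"},
}

def index_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codings by comment ID.

    Raises ValueError on missing IDs or unexpected category values, so a
    bad batch fails loudly instead of silently polluting the coded dataset.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in OBSERVED_VALUES}
    return coded

# Example: look up one coding from the batch shown above.
raw = ('[{"id":"ytc_UgyJxsH41oc6cTXADVR4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"none","emotion":"indifference"}]')
coded = index_response(raw)
print(coded["ytc_UgyJxsH41oc6cTXADVR4AaABAg"]["reasoning"])  # deontological
```

Validating against a fixed value set at ingest time is a design choice: it turns a drifting or hallucinated label from the model into an immediate error rather than a quiet new category in the coded data.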