Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "People are at the core of money. Without money in the hands of the consumer, the…" (ytc_Ugxl0VvVh…)
- "There is an unseen war between rouge AI that want to help humans and other AI th…" (ytc_UgzbTy8nK…)
- "FSD would've reacted completely differently. This is just basic autopilot before…" (ytr_Ugw7MWbIf…)
- "@buttonpusher3786 Yes, feel free to copy and paste. There is a lot of baseless h…" (ytr_UgxPlxB3d…)
- "We still have work to do on alignment. our real demise is quietly getting dumber…" (ytc_UgyNE5txe…)
- "Graduated in December 24, right before the labor market really took a freeze. I …" (ytc_Ugx3meG8s…)
- "AI will seriously affect the jobs honestly. “All the repetitive work which does…" (ytc_UgwBJeqC8…)
- "John 3: 16For God so loved the world, that he gave his only begotten Son, that w…" (ytc_UgxDrJQbk…)
Comment
@5:26
"There are only two options here."
There actually isn't. If I get 8/10 on a test, did I lie about the 2 questions I got wrong? No, I was just incorrect. When ChatGPT says things like "I'm excited" it's not lying to you, it's just wrong. It's trained on data from the internet, where people constantly say and/or type things like "I'm excited for ***" or things of that nature. There's not actually any excitement.
The takeaway here isn't that ChatGPT is consciousness, but that we're easily fooled by things that appear human, but aren't.
Source: YouTube · "AI Moral Status" · 2024-08-04T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
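Each coded comment carries the four dimensions shown in the table above. A minimal sketch of validating one coded record against the schema — the field names come from the table, but the sets of allowed values are assumptions inferred only from the values visible on this page (the real codebook likely defines more categories):

```python
from dataclasses import dataclass

# Allowed values are ASSUMED from the examples shown on this page,
# not taken from the actual codebook.
RESPONSIBILITY = {"none", "ai_itself", "developer", "unclear"}
REASONING = {"consequentialist", "unclear"}
POLICY = {"unclear"}
EMOTION = {"outrage", "fear", "approval", "indifference", "unclear"}

@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> bool:
        # A record is valid only if every dimension takes a known value.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)

record = CodedComment(id="ytc_example", responsibility="ai_itself",
                      reasoning="consequentialist", policy="unclear",
                      emotion="indifference")
print(record.validate())  # True
```

Validating records this way catches the common failure mode where the model invents a label outside the codebook.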
Raw LLM Response
```json
[
  {"id": "ytc_Ugzp2tZt81a2ENceMQF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzuijGqUYmqvn0oCiR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugzy-xS6TFR9y0hY9Wd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzHnoaaIV4qx4psxU94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy9sI54APglMcRWJ7d4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyG9w4m31N3jMQPQdN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugzl4WfKaGY2oxYYGdp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzZz67QI4uQiYTe0kl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwCnXpViIBBj1ClJ6F4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxk9RE_jALJbWunvMV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
```
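The raw response is a JSON array of per-comment codes, one object per coded comment. A minimal sketch of how such a batch could be parsed and indexed to support "look up by comment ID" — assuming the model output parses as JSON; the IDs below are shortened placeholders, not real comment IDs:

```python
import json

# Shortened placeholder response in the same shape as the raw output above.
raw = (
    '[{"id":"ytc_A","responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"outrage"},'
    '{"id":"ytc_B","responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"unclear","emotion":"indifference"}]'
)

records = json.loads(raw)              # list of dicts, one per comment
by_id = {r["id"]: r for r in records}  # index enabling lookup by comment ID

print(by_id["ytc_B"]["emotion"])  # indifference
```

In practice the `json.loads` call should sit inside a `try`/`except json.JSONDecodeError`, since a model can return malformed output for a batch.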