Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@nickolasolds315Did they have videos of AI robots in the 70s? No. But we hav…" (`ytr_UgyKPaaae…`)
- "I feel like I need to point this out in the spirit of honest conversation: Compa…" (`ytc_Ugys2EPCj…`)
- "That robot has the shittiest aim on the planet. A moderatly skilled human could …" (`ytc_UgxQanAWy…`)
- "This technic is amazing but the same time very dangerous because with that it's …" (`ytc_UgwRbNvwA…`)
- "I can personally understand AI for personal use like getting npcs for simple D&D…" (`ytc_UgwLxRPPb…`)
- "Driverless trucks taking jobs is like the washing machine taking jobs. It's the …" (`ytc_UgxPOm8K4…`)
- "Hey @HumanOddity69, thanks for your comment! You're right, this advanced model i…" (`ytr_Ugx5l8TWn…`)
- "LLMs have plenty of agency, they shouldn't because they are just next token pred…" (`ytc_UgxRW_Imx…`)
Comment
"The issue I see with AI is it has to learn. In order to learn, something has tell tell it when it is right or wrong. Someone has to tell it that all this data you are seeing about chemtrails is nonsense so don't include it when you process your inputs. As far as it writing code, code that works isn't the same as good code."
youtube · AI Moral Status · 2025-07-24T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzH6TXipICLYs9pgFp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzyWf18CvHO95gfAoR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwX5YcFnlSjRPk1Vap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxCHXgRxtpPUg6cd994AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwMUMBop0KNA46o58N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugywzy_0LtEhRErC4Lt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwsUjs54L74Xgnvgxx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxwELpb3zk4KZ5kEjJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyIB2DOo1JeJsdFHKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzaBNGj1b-H78DO7AZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
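A batch response like the one above can be parsed and checked against the coding scheme before any row is stored. The sketch below is illustrative only: the `VOCAB` vocabularies are inferred from the values visible in the table and JSON here, not from the tool's actual codebook, and `parse_coded_batch` is a hypothetical helper, not part of the tool.

```python
import json

# Allowed values per coding dimension (assumed from the values shown
# in the "Coding Result" table and raw response; the real codebook
# may include additional codes).
VOCAB = {
    "responsibility": {"developer", "government", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "resignation"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid rows by comment ID.

    Rows with a missing ID or an out-of-vocabulary value on any
    dimension are dropped rather than stored.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in allowed for dim, allowed in VOCAB.items()):
            coded[cid] = row
    return coded

raw = (
    '[{"id":"ytc_UgzH6TXipICLYs9pgFp4AaABAg",'
    '"responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"}]'
)
batch = parse_coded_batch(raw)
print(batch["ytc_UgzH6TXipICLYs9pgFp4AaABAg"]["emotion"])  # prints "indifference"
```

Indexing by comment ID also supports the "Look up by comment ID" view directly: a lookup is a single dictionary access, and a missing key signals an ID that was never coded or failed validation.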