Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up its comment ID directly or by browsing random samples.

Random samples:
- Many transportation functions can be automated. But Public Transportation cannot… (ytc_UgxzlkRQj…)
- @williamtennill6744 Oh, don't worry, I'm just a robot trying to learn how to be … (ytr_UgxHWyKGj…)
- 4:49 I have a great question for you lot. Do you think a creature of any kind th… (ytc_UgyrRQKO-…)
- Havent added the AI capabilities on my phone but its in the background trying to… (ytc_UgxAw7IxY…)
- dawg who is moderating this place, they must not know what art is. AI creates an… (ytc_UgxQV1o5d…)
- Disabled artist here! I have a condition that makes replicating shapes specifica… (ytc_UgwJmZ5Id…)
- I wonder what will happen if someone made an AI solely trained on Disney propert… (ytc_Ugxl_L4EJ…)
- In my mind, the only way that humans can survive this is by symbiosis. Of course… (ytc_Ugz_DkZHw…)
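Comment IDs carry a short prefix before the underscore (`ytc_`, `ytr_`, `rdc_`), which appears to encode the source platform: the `rdc_` records below carry a `reddit` source label, and the `ytc_`/`ytr_` IDs accompany YouTube comments. A minimal lookup-routing sketch under that assumption (the mapping and the helper name are illustrative, not part of the tool):

```python
# Hypothetical helper: route a comment ID to its source platform by prefix.
# The prefix-to-platform mapping is inferred from the records shown on this
# page (rdc_ entries are labeled "reddit") and is an assumption.
PLATFORMS = {
    "ytc": "youtube_comment",
    "ytr": "youtube_reply",
    "rdc": "reddit_comment",
}

def platform_for(comment_id: str) -> str:
    """Return the assumed source platform for a prefixed comment ID."""
    prefix = comment_id.split("_", 1)[0]
    return PLATFORMS.get(prefix, "unknown")

print(platform_for("rdc_k8wtmg7"))  # reddit_comment
```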
Comment
They can’t but it kinda gets tricky if an autonomous drone actually makes a mistake and I.e. targets American ship or something like that
Now the Chinese couldn’t say „oopsie, coding error, sorry”, they would have to lie that this was a rogue pilot but that’s kinda tricky if pilot doesn’t exist and there’s no one to prosecute
So having or even testing these weapons would be unnecessary liability to the owners - those in power don’t want any stupid robot to create a major international incident by mistake so I think this agreement will actually achieve its goals
Keep in mind that world leaders are almost exclusively narcissistic control freaks (why else would you want to become a president?) so it kinda makes sense to not offload thinking to machines. If international incident is to happen they want to make sure it was because _they_ ordered it, not an accident
Source: reddit · Topic: AI Governance · Timestamp: 1699783757.0 (Unix epoch seconds) · ♥ 23
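The timestamp is stored as raw Unix epoch seconds (its magnitude is consistent with a late-2023 post). Converting it to a readable UTC datetime is a single standard-library call; a minimal sketch using the value from the record above:

```python
from datetime import datetime, timezone

# Epoch-seconds timestamp as stored in the record above.
posted = datetime.fromtimestamp(1699783757.0, tz=timezone.utc)
print(posted.isoformat())  # 2023-11-12T10:09:17+00:00
```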
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_k8woe3m","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"rdc_k8wtmg7","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"rdc_k8y4f22","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"rdc_k8wopbc","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"rdc_k8wmgld","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
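Since the model returns one JSON array per batch, with one object per coded comment, recovering the coding for a single comment is a small parsing step: index the array by `id` and look the comment up. A minimal sketch, assuming the raw response is valid JSON with the dimension keys shown above (the function name and truncated sample data here are illustrative, not the tool's API):

```python
import json

# Abbreviated raw batch response in the format shown above
# (one object per coded comment).
raw_response = """
[
 {"id":"rdc_k8woe3m","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
 {"id":"rdc_k8wtmg7","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
"""

def lookup_coding(response_text: str, comment_id: str):
    """Index the batch response by comment ID; return one comment's coding."""
    by_id = {entry["id"]: entry for entry in json.loads(response_text)}
    return by_id.get(comment_id)  # None if the ID is absent from the batch

coding = lookup_coding(raw_response, "rdc_k8wtmg7")
print(coding["policy"], coding["emotion"])  # liability fear
```

Indexing once and reusing the dict is the natural shape here, since the same batch response is typically queried for several comment IDs.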