Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "As a perfectionist, AI is actually encouraging me to post my art that isn't perf…" (ytc_UgxHIdUKr…)
- "AI does what AI does. It's the fault of the (human) police officer, he's too dum…" (ytc_Ugyjr-s8d…)
- "What's the difference if between and AI program doing pieces in a "like style" a…" (ytc_UgxA0C_UN…)
- "Some days ago I got the chance to talk with an artist who was holding a small ex…" (ytc_UgwMClMGs…)
- "Ai art is great. Better than regular art. I wont support a regular artist but I …" (ytc_UgwzUzZ0f…)
- "'What we really need is a world government run by intelligent and thoughtful peo…" (ytc_UgxQyVqB7…)
- "You can't 'disinvent' AI, it's here to stay for better or for worse. The cat's o…" (ytc_UgzuncGtk…)
- "I recently was testing out Claude Sonnet 4, operating under the assumption that …" (rdc_mxgoh33)
Comment (youtube, 2026-03-10T19:4…)

> The definition is a BS term IMO. Brimstone (for example) is highly autonomous, can select one of more multiple targets and intercept them without human intervention once engaged, and been in service for 10 years. Many of the algorithms on (something like) Brimstone are more sophisticated than most "AI powered" systems that try and brute force the solution. Modern weapons are just a bag of (usually) widely available sensors that uses those inputs and clever maths to deliver a "payload". When we get "Skynet" we can talk about "autonomous", until then there is nothing actually "autonomous" about them, they're just evolutions on distributed "fire and forget" systems, like the Brimstone, Hellfire and dozens of other similar systems that have been around for a long time.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxXDGygvxhKOLB1hbZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw9Q0xh9YzYo7A6g2Z4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzRVCTp_1wHz7lIJv94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy84HB8qHK2enSEPwx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxGAcKqP9ouh9fvwPF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyIZUJMmUwSwmr3Di14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwNAZESJs6SnlTooi54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzEGHIj3h0QYVumwUd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugy1VMBLGkV6gqrHbW54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzOAQfFmmlEv2YYo6B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
```
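A raw response in this shape can be parsed into a lookup table keyed by comment ID and checked against the coding scheme. A minimal sketch: the `ALLOWED` category lists below are inferred from the values visible in this section, not the tool's actual schema, and `parse_batch` is a hypothetical helper, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the samples above
# (an assumption, not the tool's actual schema).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"fear", "indifference", "approval", "outrage", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, validating every dimension."""
    by_id = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim!r}: {row.get(dim)!r}")
        by_id[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return by_id

# Look up one coded comment by its ID:
raw = ('[{"id":"ytc_Ugw9Q0xh9YzYo7A6g2Z4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
coded = parse_batch(raw)
print(coded["ytc_Ugw9Q0xh9YzYo7A6g2Z4AaABAg"]["emotion"])  # indifference
```

Validating at parse time catches the common failure mode of the model inventing an off-scheme label before it silently enters the coded dataset.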