Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

I'd love to see a country or someone develop drones, land, air, water, etc. That are very cheap & basically just contain guiding equipment and a small explosive charge. They'll be easy to mass produce & could overwhelm defense systems. Using AI you could teach them to target very specific things like doorways, windows, gun barrels (one explodes a few cm from a cannon barrel on a ship for example & some shrapnel gets inside the barrel), etc. You could also set them to seek out living targets & just suicide bomb them. You could have a mix of larger and smaller drones that all work together to coordinate an attack.

Source: youtube · Posted: 2020-03-07T22:2… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx3UOaOAlbo8KikDnF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxzuUTUZin-4FbDxsZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwoUztZ1YjE2i8m9gh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxu4HNEfCD7cvLcL8d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzohbX1pe9AEtslGS54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz_pPw7FyzHTPyhekR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx9Bt4Y570CJfxBdkB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxyArdv_xyDytAotKR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwMJ7asi3dK1u_VKs94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxpZ_AeBAj4LnQEhdV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
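A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the values visible in this batch (the actual codebook may define additional categories), and the function name is illustrative.

```python
import json

# Allowed code values per dimension, inferred from this batch only
# (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "mixed", "approval"},
}


def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index records by comment id,
    raising if any dimension holds a value outside the expected sets."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
        by_id[rec["id"]] = rec
    return by_id


raw = ('[{"id":"ytc_Ugz_pPw7FyzHTPyhekR4AaABAg",'
      '"responsibility":"none","reasoning":"consequentialist",'
      '"policy":"none","emotion":"approval"}]')
batch = validate_batch(raw)
```

Indexing by `id` is what makes the "look up by comment ID" view cheap: each coded record can be fetched in O(1) once the batch is parsed.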