Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Yeah this is already happening, especially with medical students.
Most AI can g…
ytc_UgxpBfBZl…
These individuals aren’t specifically saying how AI is actually helping or impro…
ytc_UgyyONS2z…
I dont share my art online because of tyrannical, abusive artists. My art isn't …
ytc_Ugx4_9OLh…
It's probably way better than when the East India Company was buying countries b…
rdc_d7ku9nd
I used to use ChatGPT long before it was once better at everything and very deta…
ytc_Ugw1YZyzX…
Its all greed...humans are not ready for AI..for several reasons humans are st…
ytc_Ugwvb4_Cn…
I'm on an AI story writing site, and it's so interesting because it's able to ke…
ytc_UgyL-D2gA…
here's the thing one way or another we will wipe ourselves out be it pollution, …
ytc_UgxuFueYL…
Comment
One thing to remember, smart weapons (like ai powered weapons) let the military be more precise with their targeting. In world war 2 the solution to targeting a factory producing weapons was to carpet bomb an area... These attacks, when done in cities, would kill tens of thousands of people. Around 700,000 civilians were killed in this way as collateral damage.
The idea of autonomous weapons sounds scary, but human operated weapons have errors to this day (example, the US accidentally destroyed a girl's school in Iran just recently killing over 100 children.). So getting away from automated weapons doesn't keep people safer, it more likely puts them in greater danger.
youtube
2026-03-12T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzgQRwbcMlHpGU6Dpd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxU7a5jZ_azgtkoM0p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxPQA_60Y9dLqP4mxB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzWZF_HRI6QNB_PcTl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxdOdxkPGATRR-tqYR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwj8ojZCEIDgVnM65p4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyFTad-ADtATZKXRFZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyvE4S7w2SdJooUT3F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzkGAJLEAcvlKwVDaV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwl_yAFSpOLzFHw0rt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
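The raw response above is a JSON array of per-comment records, one object per comment ID, with one label for each of the four dimensions in the coding-result table. A minimal sketch of how such a response could be parsed and validated is below; the allowed value sets are assumed from the labels visible in this dump, and the real codebook may define more.

```python
import json
from collections import Counter

# A raw coder response in the same shape as the one shown above
# (two records kept here for brevity; IDs copied from the dump).
RAW = """
[
  {"id": "ytc_UgzgQRwbcMlHpGU6Dpd4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxU7a5jZ_azgtkoM0p4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
"""

# Allowed labels per dimension -- assumed from the values observed in
# this dump, not an authoritative codebook.
SCHEMA = {
    "responsibility": {"government", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}


def validate(raw: str) -> list[dict]:
    """Parse a raw coder response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records


records = validate(RAW)
emotion_counts = Counter(rec["emotion"] for rec in records)
```

Validating at ingest time catches label drift (e.g. a model inventing a value outside the codebook) before bad records reach the dashboard.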