Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
One thing to remember, smart weapons (like ai powered weapons) let the military be more precise with their targeting. In world war 2 the solution to targeting a factory producing weapons was to carpet bomb an area... These attacks, when done in cities, would kill tens of thousands of people. Around 700,000 civilians were killed in this way as collateral damage. The idea of autonomous weapons sounds scary, but human operated weapons have errors to this day (example, the US accidentally destroyed a girl's school in Iran just recently killing over 100 children.). So getting away from automated weapons doesn't keep people safer, it more likely puts them in greater danger.
youtube 2026-03-12T03:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           industry_self
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzgQRwbcMlHpGU6Dpd4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxU7a5jZ_azgtkoM0p4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxPQA_60Y9dLqP4mxB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzWZF_HRI6QNB_PcTl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxdOdxkPGATRR-tqYR4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugwj8ojZCEIDgVnM65p4AaABAg", "responsibility": "government", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyFTad-ADtATZKXRFZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyvE4S7w2SdJooUT3F4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzkGAJLEAcvlKwVDaV4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugwl_yAFSpOLzFHw0rt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
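A raw response like the one above can be matched back to the comment it codes by indexing the records on their `id` field. A minimal sketch in Python, using one record copied verbatim from the response (the variable names are illustrative, not part of the pipeline):

```python
import json

# One record from the raw LLM response above, kept verbatim.
raw = '''[
  {"id": "ytc_UgzWZF_HRI6QNB_PcTl4AaABAg",
   "responsibility": "none",
   "reasoning": "consequentialist",
   "policy": "industry_self",
   "emotion": "approval"}
]'''

# Parse the JSON array and index each coded record by its comment id.
records = json.loads(raw)
coded = {r["id"]: r for r in records}

# Look up the coding result for a specific comment.
row = coded["ytc_UgzWZF_HRI6QNB_PcTl4AaABAg"]
print(row["policy"])  # industry_self
```

In practice the full array would be parsed the same way; the dict lookup then gives the dimension values shown in the Coding Result table for any comment id.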