Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Examples of some good and bad terminators - Adolf Hitler, Mahatma Gandhi, Osama Bin Laden, Donald Trump. So most of these are illiterate good speakers, able to motivate or influence people to follow a path. Such as this, let's say a trained, top notch Engineer or Computer Scientist, fed up with Capitalism, Democracy, Industrialism, or everything in the World. A programmer as this, can make a model justifying wars, deaths, genocide, by beginning from the beginning by feeding only the internet data which motivates these hatred, unruled actions of humans itself, easily justified by the man of today and being enjoyed as well. By the looks of it, AI just need a bad seed, a bad model by a bad person (which the world doesn't lack at any century). It's just an invention which we held until it was possible but when we made and realised it's not a valid thing for lives of others but chose not take it back by all the perks involved. A genius holding a bomb would not hold it for long.
youtube · AI Governance · 2026-04-01T14:3…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          ban
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugz-Gh575HbOzr5TCyV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy3351DUgmv987PLvl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx2T8cpHtuN0jzRNld4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyIL8Gdd4Uy6eozncl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzz4O64NSipWvcE30x4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzcUD7xtnibYbERoJl4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgyTIfl5IxKBmrKsdvx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy2JH4lJ93V3N9SGzN4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxUYpNKS3rya3Fk3AR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwVo_N7raO02SP46MZ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]
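A raw response like the one above is a JSON array of per-comment records, and the coding-result table is simply the record whose `id` matches the comment being inspected. The sketch below shows one way such a response might be parsed and matched to a comment, with the record's values checked against the label sets seen on this page. The field names come from the response itself; the allowed value sets are inferred from the values appearing here and are an assumption, not a confirmed schema, and `find_coding` is a hypothetical helper name.

```python
import json

# Allowed labels per dimension, inferred from the values on this page
# (assumption: the real codebook may contain additional labels).
ALLOWED = {
    "responsibility": {"none", "user", "company", "government", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def find_coding(raw_response: str, comment_id: str) -> dict:
    """Parse the raw JSON array and return the validated record for one comment id."""
    records = json.loads(raw_response)
    for rec in records:
        if rec.get("id") == comment_id:
            # Reject records whose values fall outside the inferred label sets.
            for dim, allowed in ALLOWED.items():
                if rec.get(dim) not in allowed:
                    raise ValueError(f"unexpected value for {dim}: {rec.get(dim)!r}")
            return rec
    raise KeyError(f"no record found for {comment_id}")
```

Applied to the response above with id `ytc_UgwVo_N7raO02SP46MZ4AaABAg`, this would return the record matching the coding-result table (responsibility `user`, reasoning `deontological`, policy `ban`, emotion `fear`).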