Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I will say maybe on this video.. If I was the inventers of A.I. tech, I would be more worried about hackers and Computer virus, then Shady software.. I have a huge Question that has been bugging me for a long time, Why does it seem like every time some cool tech comes a long, Why can't we have it for helping American society, then making fucking war weapons of mass destruction.. every time there is something like a good piece of tech, it is instantly use to kill people in another country, instead of helping to make peoples lives better? I mean we have Fucking good holograms tech and what does people do with it? Use it to help kill people in a war. We could have saved the fucking movie industry, and to help people be entertain or help people communicate across the states.. Its like every fucking tech that is new, some fucking weapons manufacture instantly turns that tech (that can help people), into a fucking killing machine.. This is turning into a real issue with America, they don't want to take cool new tech and use it to help people, they rather use this cool new tech to fucking kill people. what do our government and corporation do with brand new tech,  Kill, Kill, kill, kill,kill, kill, kill... Its so fucking sad.
youtube 2015-07-31T01:2…
Coding Result
Dimension      Value
Responsibility company
Reasoning      deontological
Policy         liability
Emotion        outrage
Coded at       2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UghyligspsInLngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgiR781rAxGYn3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugj5te92XVT4QXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggsHT54JBbk6ngCoAEC","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugju1o0fz003UHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgjIsz0jcDy5yngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgieaIBYq1XbdngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgibTCMNgh18engCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugh6qLpEz9mISngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugis0cWW2lpi3ngCoAEC","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
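A minimal sketch of how the per-comment coding shown above can be recovered from a raw response of this shape: parse the JSON array and index the rows by comment id. The id and field names are taken from the response shown; the variable names are illustrative, and a real pipeline would also validate that each row carries the expected dimensions.

```python
import json

# Raw LLM response, abridged to two rows of the array shown above.
raw = """[
  {"id":"ytc_UghyligspsInLngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgiR781rAxGYn3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""

rows = json.loads(raw)

# Index rows by comment id so any coded comment can be looked up directly.
by_id = {row["id"]: row for row in rows}

# Look up the coding for the comment inspected on this page.
coding = by_id["ytc_UgiR781rAxGYn3gCoAEC"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
# → company liability outrage
```

Indexing by id rather than by array position keeps the lookup stable even if the model returns rows in a different order than the comments were submitted.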