Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
why would anyone be dumb enough to give an in animate object virtual feelings it still wouldn't be real c3po and r2d2 had personalities but their memories were wiped between episodes 3 and 4 so they were brand new all over again is that "violating" the "rights" of a robot who cares about robo- machines that's like programing cars and laptops to have personalities and they say whether u can go somewhere or search something up on the internet what purpose does that serve anyone other than making things more difficult that's like giving all of the battle droids and clone troopers from starwars the chance to have their own opinion and make their own decisions which is foolish a clone trooper and a battle droid were made to take orders and wage war just like how a toaster or a microwave take orders to prepare food just the idea is stupid that's like giving trees rights trees may be alive but they don't do anything that would imply us to give them rights people who hurt animals for fun aren't accepted by other people anyways the butchers who chop up cows for food um WE NEED FOOD however they violate the cows rights in the way they kill them which is slow and painful to the cow which deserves a quick respected death because we are using its dead body as food and I know that I wouldn't want me dead body to be eaten so if I'm gonna be used for food I would at least like a respectable death like a beheading making AI is a waist of time and people should be consent with machines that need human operators
YouTube · AI Moral Status · 2017-02-24T03:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          outrage
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgiHxUzYsGI4e3gCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgijvnN8rxT23XgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgiVw7y25qwdLXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjXq74qnn4w1HgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UggQpIKTMpgtzngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghqIPRZeJxYD3gCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UggHKar-b2b8k3gCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UghY0ZAiPP5dD3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UghbRTz2I1HUJHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghmIkWSpY9XpXgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
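A raw response like the one above can be checked mechanically before the codes are trusted. The sketch below parses the JSON array and drops any record whose labels fall outside the coding scheme. The allowed label sets are an assumption inferred from the values that appear in the response above, not a documented schema; the function name is hypothetical.

```python
import json

# Allowed labels per coding dimension.
# ASSUMPTION: inferred from values observed in the raw response; not a published schema.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "unclear"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed", "unclear"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response; keep only records with valid labels."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every dimension must be present and carry a label from the scheme.
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid
```

A record with an unexpected label (a common LLM failure mode) is silently filtered here; in practice you would likely log the rejected IDs for re-coding rather than drop them.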