Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
if a robot is programmed to do a certain thing, they would do it willingly because it's their function right?
youtube AI Moral Status 2017-02-23T15:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgiFBSdhZ6OiBHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Uggmwyliw6Ndm3gCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgjagJyEa3ihdHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UggaSjYh5W4t03gCoAEC","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgjYo2NEXZe5yngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugj7xHSCB362wngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UggFmovwouz0T3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggXNbZRYXgRMXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgiOm4edH9tF53gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgiVIEHUHhcKzngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
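A raw batch response like the one above can be parsed and indexed by comment id to recover the per-dimension coding for any single comment. The sketch below is a minimal illustration, not the tool's actual pipeline: the two entries in `raw_response` are copied from the dump above, the dimension names match the coding-result table, and `index_codings` is a hypothetical helper name.

```python
import json

# Hypothetical excerpt of a raw LLM batch response; the full response
# above contains ten such objects, one per coded comment.
raw_response = '''
[
  {"id":"ytc_Ugj7xHSCB362wngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgiFBSdhZ6OiBHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
'''

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the raw JSON array and index codings by comment id,
    keeping only the expected dimensions and defaulting any
    missing dimension to "unclear"."""
    rows = json.loads(raw)
    return {
        row["id"]: {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
        for row in rows
    }

codings = index_codings(raw_response)
# Look up the coding for the comment shown on this page.
print(codings["ytc_Ugj7xHSCB362wngCoAEC"])
```

With this index, the "Coding Result" table for a given comment is just the dictionary entry for its id; a lookup that raises `KeyError` would signal that the model dropped a comment from its batch response.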