Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I say we make one robot with the emotional potential to need rights and then stop there, just so we can prove we can do it. Producing more would be a disaster as we barely have the resources to care for ourselves, so if we start producing factory loads of rights demanding robots then we will all collapse due to the new mass increase in demand for things like power.
Source: YouTube · AI Moral Status · 2017-02-24T01:5…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UghlGXyQaNsIXngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugh3566kEkH8cngCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugiw7wyfDqb7DXgCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugh_Jnzba6zPMngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjTD1xyiwqHLngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiOCFMcZTm5j3gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgjGbkL2h-4AkXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugg4D6Jf2shlKXgCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugh5cHol1AeHWHgCoAEC", "responsibility": "distributed", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UghIJiW-jZtI7HgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "liability", "emotion": "approval"}
]
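A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal illustration, not the pipeline's actual code: the allowed label sets in `SCHEMA` are assumptions inferred from the values visible in this dump, and the validator simply rejects any row whose label falls outside them.

```python
import json

# Assumed label vocabulary per coding dimension, inferred from this dump.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "unclear", "ban", "regulate", "industry_self", "liability"},
    "emotion": {"indifference", "fear", "approval", "outrage"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any out-of-schema labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# Example: one row matching the coded comment shown above.
raw = ('[{"id": "ytc_Ugiw7wyfDqb7DXgCoAEC", "responsibility": "distributed",'
       ' "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}]')
rows = validate_response(raw)
```

A check like this catches the common failure mode where the model invents a label (e.g. "anger" instead of "outrage") that would otherwise silently enter the coded dataset.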