Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
well if it asks it need rights then give them, if not then dont. try to make ai programs only for the benefit of mankind if possible and make them dont care if they “die“ or not, make protect and benefit mankind as first objective in their command line, if they could over ride this line it means they could understand themselevs and we should give them rights.
youtube · AI Moral Status · 2017-02-23T16:4…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        contractualist
Policy           regulate
Emotion          mixed
Coded at         2026-04-27T06:26:44.938723
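
The four dimensions above come from the coding schema. As a rough sketch, and assuming the label sets contain only the values visible on this page (the full codebook may define more), the schema and a simple validation check might look like this:

```python
# Sketch of the four coding dimensions; the label sets below include only the
# values that appear on this page and may be incomplete.
ALLOWED_LABELS = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear", "mixed"},
}

def validate_record(record: dict) -> bool:
    """Return True if every dimension in the record uses a known label."""
    return all(record.get(dim) in labels for dim, labels in ALLOWED_LABELS.items())

# The coding result shown above passes the check:
print(validate_record({"responsibility": "ai_itself", "reasoning": "contractualist",
                       "policy": "regulate", "emotion": "mixed"}))  # True
```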
Raw LLM Response
[ {"id":"ytc_Ugi9oKcY5syPlngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugipm9QoHHtAZngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_UgjCG-Y5si0xkXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UggE2jjroha-C3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgipWevt7j_kCngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UghGeSiPL9jIjngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugg4wUNlmwDRengCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgiBkJI_0TVCGHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}, {"id":"ytc_UggfHiAyN5W04HgCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}, {"id":"ytc_UgjouNuW5UDvnHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"} ]