Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As with any ethical question, it can go both ways. But I want to say that if developing artificial intelligence leads us to the point in human history when we are so unchallenged and comfortable that we lose our existential authenticity/humility as a species, then something is wrong. I mean that if the day comes when all the work is done for us by a.i., the character of the human outlook on the whole will change. That change might be positive to a degree, but as with all instances of progress, something will inevitably get lost in the scuffle. Since the Industrial Revolution, human outlook and values have changed dramatically, and not always for the better. I'm surprised you didn't mention the V.I. and A.I. distinction in Mass Effect. That's pretty interesting stuff, a way to avoid any ethical misgivings some people may have or developing a.i.
youtube 2013-11-09T06:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UggQA9piQKPPHHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugi7lWCqY9ksDngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgiOrCe094MKjHgCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgibYzQAmZAn1ngCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgiWTc5cGlkjXngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghImfg7p-LkC3gCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgjrmcSECEPYmngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugid59dtjYGUrXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggZCNMF-BBN4HgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgjGEKM8R0Z5f3gCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
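A minimal Python sketch of how a raw response like the one above can be turned into a per-comment coding result. This is an illustration, not the tool's actual code: it assumes the response is valid JSON with the four dimension fields shown, and it assumes (based on the matching dimension values, not on anything stated here) that `ytc_UghImfg7p-LkC3gCoAEC` is the record for the comment displayed above.

```python
import json

# Excerpt of the raw LLM response above (two of the ten records).
raw = """[
  {"id": "ytc_UggQA9piQKPPHHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UghImfg7p-LkC3gCoAEC", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]"""

# Parse the array and index the coded records by comment id.
records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}

# The Coding Result table for a comment is just the four dimensions
# of its record; the id here is assumed to belong to the comment above.
rec = by_id["ytc_UghImfg7p-LkC3gCoAEC"]
print(rec["responsibility"], rec["reasoning"], rec["policy"], rec["emotion"])
# → none mixed unclear mixed
```

Indexing by `id` first (rather than scanning the array per lookup) keeps retrieval constant-time when a batch response covers many comments.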