Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The new Turing Trap, the Imitation Game, copy and replicate human work, replace? Brynjolfsson argues that by obsessing over the Imitation Game, humanity fell into a trap. If you spend billions of dollars building a machine that does exactly what a human does - just faster and cheaper - you haven't actually expanded what humanity can achieve. You have simply built a labor substitute. The only way to avoid the Turing Trap is to stop trying to win the Imitation Game. Instead of building AI to mimic humans, Brynjolfsson and other economists argue we need to build AI to augment humans. We should be designing systems that do things the human brain fundamentally cannot do, like instantly analyzing a billion chemical compounds to cure a disease, or predicting complex weather patterns to prevent disasters. If AI is a tool that extends human reach, rather than a rival that copies human output, then human labor becomes more valuable, not less. We remain the orchestrators, and the economic benefits are shared rather than hoarded. It loops right back to the idea of Economic Human Alignment: if we let AI just replace us to save a few bucks, we crash the economy. If we use it to elevate us, it could fund that global UBI we are talking about.
youtube AI Moral Status 2026-03-01T04:0…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       contractualist
Policy          regulate
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgxMUKhvT2axm6zay-d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwYT7SqIQREwO_nf9J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy55HRusgII_JqtZ9B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwmYA3w_Qs59dN_y654AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyzAkfq4ft_jyJa2t54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzOl62x8kwtxGngEU54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx7tiWXl895cx2tfgZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugw9WPHHmrgMbtn9UnV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxmDxH5b-VFniKDwFt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugx3Q27nZJB4-FBWHrp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
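The coded result shown above matches one record in the raw array (the distributed/contractualist/regulate/mixed entry). A minimal sketch of extracting a single comment's codes from a raw response, assuming the response parses as valid JSON; the helper name `code_for` is illustrative, not part of any actual pipeline:

```python
import json

# Raw LLM response: a JSON array of per-comment codes across the four
# dimensions (responsibility, reasoning, policy, emotion). Two records
# from the response above are reproduced here as sample data.
raw_response = '''[
  {"id": "ytc_UgxmDxH5b-VFniKDwFt4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugx3Q27nZJB4-FBWHrp4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]'''

def code_for(comment_id: str, raw: str) -> dict:
    """Return the coded dimensions for one comment id (KeyError if absent)."""
    records = {r["id"]: r for r in json.loads(raw)}
    return records[comment_id]

coded = code_for("ytc_UgxmDxH5b-VFniKDwFt4AaABAg", raw_response)
print(coded["reasoning"])  # contractualist
```

Keying the records by `id` makes the lookup robust to the order in which the model emits the array.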