Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I do think that there is a real possibility that AGI and super-intelligence can be created, and if that scenario does come about it will most definitely harm us ignorantly in some way. However, I think that there is a much greater possibility that AI development will accelerate climate change so badly that we will have to drop it as a project before it gets to that point. These things suck up so much power and water it is not even funny. There is no way we can continue to make them smarter with the amount of resources we have left.
youtube AI Moral Status 2025-11-30T01:3… ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwWOIiuRAn2sFnACu54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwEJQgWqnJtBI5LLrp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzaJfH6TyV6NmWFXLl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwcwCiIPqeKIQv97Ix4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwuKRFAy0cKH_Ms3OF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwFM2I10K8wAmCsj5d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwlQ3CAxP5M__IS2jZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw7dgzNeFZzx7aQbKN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxXIeVuNerDGaz9HCt4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugzz2tdw_SDD1OOq_vh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
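A response like the array above can be matched back to an individual comment by its `id` field, which is how the Coding Result shown earlier is derived for the displayed comment. A minimal sketch of that lookup, assuming only the JSON structure shown here (the `coding_for` helper is illustrative, not part of any actual pipeline):

```python
import json

# A subset of the raw LLM response array shown above.
raw = '''[
  {"id":"ytc_UgzaJfH6TyV6NmWFXLl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw7dgzNeFZzx7aQbKN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]'''

def coding_for(raw_response: str, comment_id: str):
    """Return the coding row for a comment id, or None if the model skipped it."""
    rows = json.loads(raw_response)
    return next((r for r in rows if r.get("id") == comment_id), None)

row = coding_for(raw, "ytc_UgzaJfH6TyV6NmWFXLl4AaABAg")
print(row["emotion"])  # fear
```

Returning `None` for a missing id (rather than raising) makes it easy to flag comments the model failed to code.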