Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- ytc_UgyxSf9P5… — "Yes, I completely agree that many generative AI applications are questionable to…"
- ytr_UgwW_oLB3… — "it isn't transformative. AI "Art" literally cannot be copyrighted because its ma…"
- ytr_UgyjnIAIZ… — "@CC-ce6ng If you think companies invest their money into AI only for data mining …"
- ytr_UgwS3f2HO… — "learn or commission an artist. Ai is bad for people, and it's bad for the enviro…"
- ytr_UgxPiJgoz… — "I understand where you're coming from! Sophia's perspective on wisdom is definit…"
- ytc_UgyCCBPcg… — "This might sound silly, but the more I realized it makes perfect sense / If you as…"
- ytc_UgxNS7fwJ… — "POV me having the coolest story ever AI chatbot suddenly I'm in love with you Li…"
- ytc_UgyOM8xTe… — "Calling Musk and Zuck "intellectual heavyweights" was a good joke. You don't nee…"
Comment
I do think that there is a real possibility that AGI and super-intelligence can be created, and if that scenario does come about it will most definitely harm us ignorantly in some way. However, I think that there is a much greater possibility that AI development will accelerate climate change so badly that we will have to drop it as a project before it gets to that point. These things suck up so much power and water it is not even funny. There is no way we can continue to make them smarter with the amount of resources we have left.
youtube · AI Moral Status · 2025-11-30T01:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwWOIiuRAn2sFnACu54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwEJQgWqnJtBI5LLrp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzaJfH6TyV6NmWFXLl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwcwCiIPqeKIQv97Ix4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwuKRFAy0cKH_Ms3OF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwFM2I10K8wAmCsj5d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwlQ3CAxP5M__IS2jZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw7dgzNeFZzx7aQbKN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxXIeVuNerDGaz9HCt4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzz2tdw_SDD1OOq_vh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
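The raw response above is a JSON array with one object per comment, keyed by comment `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of parsing such a response and looking up the coding for a single comment ID (using an excerpt of the array above; the indexing approach is an assumption, not necessarily what the tool itself does) could look like:

```python
import json

# Raw model output: a JSON array of per-comment codings (excerpt from the
# response shown above).
raw = """
[
  {"id": "ytc_UgwWOIiuRAn2sFnACu54AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgzaJfH6TyV6NmWFXLl4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the codings by comment id so a single comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_UgzaJfH6TyV6NmWFXLl4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

The same dictionary can back an ID-lookup feature: any comment ID present in the model's response resolves to its full coding in constant time.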