Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "even seeing AI just as a multiplier for a single humans power to influence the w…" (ytc_Ugy2ds2xE…)
- "Corporate Imperative for Profit Maximization: Shareholder Primacy: For decades,…" (ytr_Ugyjbppvf…)
- "I find fundamentally, it's the demand-supply deficit that's driving such a chang…" (ytc_UgwjD42DU…)
- "There are people that use AI tools to create images that absolutely blow my mind…" (ytc_UgxIKmLHu…)
- "Now imagine the damage if we put these LLM into a human shaped robot... 🤯…" (ytc_UgzFNdTlX…)
- "I've been following Elon Musk for some time now, and after all those years, I un…" (ytc_Ugzlonj61…)
- "I just want to let you know. I am truly glad to find your channel. Thank you.…" (ytc_UgxExXSlX…)
- "Why in the world is she focusing on the carbon emissions of AI whenever it is a …" (ytc_UgxaYTcy9…)
Comment

> 6:44 This sounds like the problem. Instead of letting a single integrated AI appear. There should be a myriad of them that cannot agree and unite.
> Then we have the time to implement transhumanist principles to ourselves, preferably at rates equal if not higher than AI learning.

youtube · AI Moral Status · 2025-08-28T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugy2Mjcx-r8heGmGkRl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxUYiHqeQRfZSmeVgt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxsEhxCtqi_AzVD18B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqgUmZ8lWxWjsAY4V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzipms4qGUsj-SGXc94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugww1vgb3yIj_S55zGx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzUfz4JOc3mP_-vGW54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx6zTHOTtTL797LGXp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwEs3zReJLlXKKhBiJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzMAo9A53A8k6oN-z14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
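The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and sanity-checked before storing the codes is below; the allowed value sets are only those observed in this one batch, not the project's official codebook, and the `parse_batch` helper name is hypothetical.

```python
import json

# A raw model output in the format shown above: a JSON array of
# per-comment codes (one real ID from the batch, used for illustration).
RAW = '''[
  {"id": "ytc_Ugy2Mjcx-r8heGmGkRl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

# Value sets observed in this batch; the full codebook may allow more
# (these sets are an assumption, not the project's official schema).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear", "mixed"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response and index the codes by comment ID,
    rejecting any record whose dimension value is outside ALLOWED."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return by_id

codes = parse_batch(RAW)
print(codes["ytc_Ugy2Mjcx-r8heGmGkRl4AaABAg"]["emotion"])  # fear
```

Indexing by ID supports the "look up by comment ID" workflow this panel offers: once a batch is parsed, each comment's codes can be retrieved directly from its `ytc_…` identifier.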