Raw LLM Responses
Inspect the exact model output for any coded comment. Comments can be looked up by comment ID, or inspected from the random samples below.
Random samples
- "But surely artificial intelligence isn't artificial consciousness, so we shouldn…" (ytc_UgwAeG0jh…)
- "A lot of people are lonely and they are purposely vague to customer service peop…" (ytr_Ugxzk3oJX…)
- "I feel like it would be easy to get rid of ai, literally get rid of all the data…" (ytc_Ugz4JmFPw…)
- ""Cutting Edge"... I've literally met 4 strangers in my life that look nearly Ide…" (ytc_UgwZC9xD-…)
- "You realize thats not themain problem here right? Its the fact that they are usi…" (ytr_UgzZloaNd…)
- "So all the AI 🤖 bots taking jobs over who will be buying all the products if we …" (ytc_UgwGJXSZZ…)
- "The problem comes when you start bumping into the grey area of our own understan…" (ytc_Ugzry_6Yx…)
- "Spot on with the commentary about society and AI taking everyone's jobs. They, t…" (ytc_UgwD2LCmJ…)
Comment
But that's the thing, Dr. Tyson. History tends to gloss over a lot of things. Those people who've lost their jobs and businesses, not all of them were able to adapt and find something else that gave them the same level of economic stability and prosperity that they had before their lives were disrupted. I hazard most did not and they probably struggled until the day they died.
We know better now. We should know better. We cannot let this happen to humanity, especially our creatives.
Letting AI and those who are pushing it down our throats, win and normalize AI as it is right now, where we are forced to subscribe and pay big tech for the very thing that will harm other people, then we are going to own even less of the world that we own now. Neil's idea of the future expects that everyone can stay ahead of AI by doing things it cannot do, but nothing is stopping big tech from stealing new ideas and feeding it to their AIs and where will that leave us then?
Platform: youtube | Video: AI Moral Status | Posted: 2025-08-10T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwSyrmk1Hv14k0dap54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzvCsfxkSjYgygfH2B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwVJPKbIJUYUf3I4Ol4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgygHJGOkrLuBDwdzrN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxMfUj3wJg5MszdQnl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwWO_aFhK3d-eIwsH14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx8p7YGjp-2o9OiJK54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxbB1KgFpmwm71mont4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx3k-4zgcoM-5qfLvh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz1BIulg81BS8nXNet4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
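The raw response is a JSON array of per-comment codes, each keyed by comment ID. A minimal sketch of how the lookup-by-ID view can be built from such a batch response (the function and variable names here are illustrative, not part of the actual pipeline):

```python
import json

# A two-row excerpt of the batch response shown above, used as sample input.
raw_response = """
[
  {"id": "ytc_UgwSyrmk1Hv14k0dap54AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwVJPKbIJUYUf3I4Ol4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

def index_codes(response_text: str) -> dict:
    """Parse a batch response and index each row's codes by its comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}

codes_by_id = index_codes(raw_response)
print(codes_by_id["ytc_UgwVJPKbIJUYUf3I4Ol4AaABAg"]["emotion"])  # resignation
```

Indexing by ID up front makes each subsequent lookup a constant-time dictionary access, which is what a "look up by comment ID" view needs.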