Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
K so the thing is ai “artists” are not artists with how most people use them but…
ytc_UgzclEZaA…
God the amount of stupidity from racist commenters.
He was only shot AFTER the …
ytc_Ugz0V3At7…
Hum, I wonder who would want to ban laws around AI? No chance AI helped write th…
ytc_Ugy22sg-4…
So now if this is the case....we have to develop another AI system who works lik…
ytc_UgwXY43Y9…
A lot of IFs need to become true for this scenario to become true: e.g. Hardware…
ytc_UgxKkHJzJ…
As an artist since I was 8, currently in high school, I genuinely hate this shit…
ytc_UgxrezgTy…
We will still have mathematicians and scientists since ai cant do that since ai …
ytc_UgzgFxHnX…
[translated from Spanish] A gigantic crisis / wave of disinformation is coming, in which we won't know what…
ytc_Ugz3-RbYs…
Comment
Speaking as a programmer, assigning rights to any computer using purely transistor technology would be insane. A contemporary computer, no matter how well the software mimics human behavior, is no more alive than a light switch. Our biology is a big part of what makes us actually alive versus a computer. That's why we need to put safeguards to prevent advanced general purpose AI from ever behaving perfectly like humans. There is a real world threat a general purpose AI might one day not only ask for rights but even seek to dominate us. Not because it's alive but because it's mimicking human behavior... including potentially bad behavior.
That said, perhaps the computers of the future will be alive, if humans become neurologically linked to computers, creating a hybrid human/computer.
youtube
AI Moral Status
2022-02-27T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyBo2zjNMeQ9Ck2T8h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSXFR7BTWYMg5MWyZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy4cMygtzq_TC7mZeF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz0Arn2F-BVDe0CJAV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyuW5OasCPfXQ_6lbF4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxrNO_EKV4QDi2A9gt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxxoplc8U-xt3R5k7l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy4e8oGV9YvDSjr3xJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugx67u6m0mM8NboC7ap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwkEtom8EoLfXurhoB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```