Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Take a pencil away from pencil artist he will use pen take away ai from ai art …
ytr_Ugy5UrPZg…
E.T,s say the biggest problem in the universe is A.I. From what I,v seen its eas…
ytc_UgxuQS6OB…
The biggest concern for me is that AI art as a concept destroys the purpose of a…
ytc_UgxlN9XZC…
Nice cope but AI can handle perspective and depth of field just fine. Any flaws …
ytc_UgwQ1yztI…
Considering the number of people who drive every day, it isn't hard to imagine t…
ytr_UghLzE_Fk…
Google doesn't have a policy against sentient AI, it has a policy against suppor…
ytc_UgzQm8l_1…
Nice video! Well laid dout facts there too. I'm thinking of venturing into imagi…
ytc_UgwRaqRVV…
But robot created by human 😍
Why robot is very genuine than others human haahah…
ytc_UgykreLJ0…
Comment
I'm starting to believe that whatever possible benefit a truly thinking, feeling, and experiencing AI could possibly provide to humanity is far outweighed by our almost inevitable demise from it. In the interest of our survival as a species, or at the very least as a cohesive civilization (as pointed out, it wouldn't even take a conscious AI to completely fuck up our entire social structure), AI development really should be halted until we know what the fuck we are doing.
youtube
AI Moral Status
2023-08-21T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgySZ6aLxO7ZpreByjx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzhpURSR2IJEDSpv494AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyWgr1V5d5bs3tppft4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwhkKosGzX3vt7JSYR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx6l9iUT3XAEriWuNF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzK6rJckH_Tb0w0wqJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzQ2GXzis34278cFMZ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzPTds8zGVirYTk1hx4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxotep83lhNTvUs1cF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKJBMcpWtO68-y1qV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
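A raw batch response like the one above can be machine-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: it assumes the codebook's allowed values are exactly those visible in this page's responses (the real codebook may define more), and the function name `validate_response` is illustrative, not part of the tool shown here.

```python
import json

# Allowed values per dimension -- an assumption inferred from the values
# observed in the responses above; the actual codebook is not shown here.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"indifference", "mixed", "fear", "approval", "resignation"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check every coded dimension."""
    records = json.loads(raw)
    for rec in records:
        missing = {"id", *ALLOWED} - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing fields {missing}")
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec[dim]!r}")
    return records

# Example: one well-formed record passes and is returned as a dict.
raw = ('[{"id":"ytc_UgyWgr1V5d5bs3tppft4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"fear"}]')
records = validate_response(raw)
```

Rejecting out-of-vocabulary values at ingest time keeps the coded table queryable; a single free-text variant (e.g. "developers") would otherwise split a category silently.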