Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "AI can only get that "smart" as we are already. Men's knowledge is stupidity in…" — `ytc_UgwjwNOoq…`
- "Why do you use an avatar from a dead artist instead of making your own art for i…" — `ytr_Ugw5ywdAd…`
- "Dude the term AI airtist is so dumb to me i cannot explain how dumb it sounds…" — `ytc_UgwXuuY4e…`
- "Sorry but even the newspaper was full of propaganda and manipulation 🫤 People ha…" — `ytc_UgwyivmID…`
- "I once used AI to make a token logo for a prototype I needed that hour, the prom…" — `ytr_Ugzxv4mb8…`
- "“AI is not about replacing the human touch but enhancing it.”-Enamul Haque Ai s…" — `ytc_UgyPjcwkn…`
- "Wait till ai makes a virus to kill all humans so they can be free.…" — `ytc_UgwM0gPBi…`
- "totally BS, anyone who uses AI knows it is BS. AI makes indeed many job types NE…" — `ytc_Ugx_D-kO4…`
Comment
A human's main purpose is to survive and to pass down genes through reproduction. Everything we feel (pain, fear, elation, satisfaction) is an extension of our innate will to survive. Every action we perform can attributed to this (seriously, give it some thought). A machine's purpose is not to survive, but to follow its directive. So even if there were an AI that has infinite knowledge and computing power, it wouldn't ever try to overthrow the human race, because it would be counterintuitive to its purpose: to serve. It would have all the necessary power to conquer us, but it wouldn't want to do it. Kind of like how you could kill a child but you don't want to, because it's against your biological purpose.
Source: youtube · AI Moral Status · 2017-02-23T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgjUKMnhflFwrHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiltTSEWD_SEXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UggEVfo-0BT3v3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggxUeCR4fvePngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj-C0VSwgP-VXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugiu3igcszow23gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg9D6n1e0Y6IngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiLfvLZG9z0PHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugg8zOaOKpgfSXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggejCERUBBXa3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
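A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example, assuming the response is a JSON array of per-comment records; the `ALLOWED` value sets are inferred only from the responses shown on this page and are hypothetical — the full coding scheme may include more categories.

```python
import json

# Allowed values per dimension, inferred from the codings on this page
# (hypothetical: the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"indifference", "fear", "approval"},
}

def validate_codings(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        # Store every dimension except the ID itself, keyed by comment ID,
        # so the detail view can look a coding up directly.
        by_id[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return by_id

raw = '''[
  {"id":"ytc_UgjUKMnhflFwrHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiltTSEWD_SEXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''
codings = validate_codings(raw)
print(codings["ytc_UgjUKMnhflFwrHgCoAEC"]["emotion"])  # indifference
```

Indexing by ID is what makes the "Look up by comment ID" box above cheap to serve: once validated, each coding is a single dictionary lookup away.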