Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Ai art isnt art at all its just cheap copy of some one's art that ai stole and m…" (ytc_Ugzczy9j6…)
- "The fact this guy looks and sounds like Emperor Palpatine makes this all the mor…" (ytc_Ugx0kQP0d…)
- "Artist mentality is just pityfull to me. AI is just a tool they'll never outprod…" (ytc_UgzTdIowX…)
- "Why DOES Google not care about AI ethics? Is it because AI is being built as a …" (ytc_UgxIRRrrq…)
- "I here that people are Marrying Animals Now the guy that made that toy robot is …" (ytc_Ugwx9E7AV…)
- "At 2:39, Yampolskiy really nailed the AI risks. Makes me think of how Pneumatic …" (ytc_Ugy0dq8xF…)
- "ai art really sucks imo and I hate how many people get fooled by it let alone su…" (ytc_UgyCaBaV1…)
- "Yes when you're replaced with AI you can just go away and fie for all they care.…" (ytc_Ugz-xtR6j…)
Comment
I'd like to argue the absurdist point of view:
Are we not just a bunch of elements mashed together into a big sack of flesh and anxiety? What makes lines of code any different than the cells that make up our bodies? A single line of code isn't a whole AI, just as a single cell isn't a whole human. Humans are very complex machines that develop ourselves independently in order to better survive our environment. If we can make a computer that does that, and it develops to the same point as we humans have, is it not conscious? If you say no, you must ask yourself, are humans conscious? We sure think we are, but maybe that's just the method we found to survive in our environment. AI can only be as conscious as it thinks it is. Just as we can only be as conscious as we think we are. So does it really matter? I say no. Live and let live.
youtube · AI Moral Status · 2023-09-17T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxLo9dHYBh3uCT6nyp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz5jjAsUTn7ki_Exu94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxSN3exjdgYBiPc0-l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw6G5AiLy2RbMtVVZx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwGg7BnME_ZA_5zBmN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw8MgKiE6vnJeWNWkJ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyajjM7RbgfHLuBy7V4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyO2EiNX3t1rb5Yl3J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxXjVeA9QnpciWzhLx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyjuzwqWcbw0P5HsMB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
```
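The look-up this page describes can be sketched in Python: the raw model response is a JSON array of per-comment codes, so parsing it and indexing by `id` recovers the dimension table for any comment. The two rows below are copied from the response above; the indexing pattern is an illustrative sketch, not the tool's actual implementation.

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codes.
raw_response = """[
{"id":"ytc_Ugz5jjAsUTn7ki_Exu94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyjuzwqWcbw0P5HsMB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# Index the coded rows by comment ID for constant-time look-up.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up the comment inspected above and print its dimension values.
code = codes_by_id["ytc_Ugz5jjAsUTn7ki_Exu94AaABAg"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {code[dimension]}")
```

Because the model returns codes keyed by the same `ytc_…` IDs the comments carry, no positional matching is needed; a batch of coded comments can be joined back to the originals purely by ID.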