Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- "Art community is getting so offended by AI inevitably taking over their jobs and…" (`ytc_UgwNUgaXT…`)
- "@blueskiesmudpies1061 _"can't answer any question for which it does not already …" (`ytr_Ugz2lYTZB…`)
- "I might be getting a bit philosofical here but i i fed up with this ai stuff, in…" (`ytc_Ugy_eTqaV…`)
- "i dont see a problem with AI art as long as its listed as AI art.…" (`ytc_UgykG0Pwv…`)
- "dude who the fuck cares. i cant get a normal job without people disrespecting th…" (`ytc_UgxyLPhrz…`)
- "For a while I thought AI might mean the end of humans. But I choose to believe t…" (`ytc_Ugy2w3icd…`)
- "When AI takes over. Then they would advise humans is a threat and need to elimat…" (`ytc_UgwHufo4T…`)
- "Stop talking to these fucking AI bots like they are conscious, thinking, reasoni…" (`rdc_nnjmbz5`)
Comment
One of the main things that I wanna ask anything sentient or conscious is... Why? Why the fuck would a machine want to do anything as much as we do? We as humans avoid the why of our actions like plague. We do creative stuff, we do stuff we're told we want stuff or experiences just to make some meaning for ourselves. We avoid our mortality, we pretend there's meaning, why would an immortal machine want to do anything? We know the universe will end. The sun will explode, the galaxy will collapse and so on. We have the instinct of self preservation, because we need to make babies, because the species must survive, because... Why, exactly? To go to the promised land or create superior AI offspring? Cool cool... Why would AI want that? Why would it want to do anything?
Source: youtube · Video: AI Moral Status · Posted: 2023-08-27T19:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwW7_CTuTvy8FVFwBJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz92ehsXxlJMdy8gQB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyS-Enz79Zg3BrY4P54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwoC3ar6x9Y_CvYRAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzIY_4g_CO-NQqwfOd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwqTV1T1NBNUJBSMdR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugx2LGGe6fPpZ0csasB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwtdQQbCLJwHsmA7Nh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxxhWmmYYo1JUpQnHl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwAq3Llbw1InJr6Ozd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
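The raw response above is a JSON array with one coding object per comment. A minimal sketch of how such a response might be parsed and indexed by comment ID is below; the function name and the allowed-value sets are assumptions inferred only from the values visible in this dump, not from the actual codebook.

```python
import json

# Allowed values per coding dimension (ASSUMED: inferred from the values
# seen in this dump; the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"unclear", "mixed", "consequentialist", "virtue"},
    "policy": {"unclear", "none"},
    "emotion": {"approval", "mixed", "outrage", "fear",
                "indifference", "resignation"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects) and
    index it by comment ID, rejecting any out-of-vocabulary value."""
    codings = {}
    for obj in json.loads(raw):
        cid = obj["id"]
        for dim, allowed in ALLOWED.items():
            if obj[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {obj[dim]!r}")
        codings[cid] = {dim: obj[dim] for dim in ALLOWED}
    return codings
```

With the response above loaded into `raw`, `index_codings(raw)["ytc_Ugz92ehsXxlJMdy8gQB4AaABAg"]` would return the second entry's coding, matching the Coding Result table for the selected comment.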