Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.
Random samples:
- "Get a degree in machine learning, or get into IT. Someone needs to manage AI, de…" (ytc_Ugzhysqy_…)
- "The one who controls AI will own humanity. Just social media could mess up what'…" (ytc_UgydDeGpJ…)
- "No AI is ever going to spend days riding its bike past its first crush's house t…" (ytc_Ugw5VOJ1L…)
- "95% of facebook and x accounts are just AI bots as most humans got deleted in th…" (ytc_UgyjkR32j…)
- "Haha so the car drives itself but you have to sit there tapping the screen and w…" (ytc_UgzNqa2pu…)
- "Military has this advanced ai already that's what they show in Terminator but no…" (ytr_Ugxk1Qua2…)
- "One of the ways you can stop the SkyNet scenario is recognizing that it doesn't …" (ytc_Ugy1BRIGK…)
- "Schools are useless ai is the future you teachers only created worst people tim…" (ytr_UgzoF-gFn…)
Comment
The people claiming that AI training is "just the same as" human artists taking inspiration don't really care about, and haven't likely even seriously thought about, whether it actually is the same or not. They're just making up rationalizations to justify doing something that they personally want to do, without caring about how it affects anyone else. But here's the thing: That's exactly the sort of rationalization that _sociopaths_ do, too.
"AI is ethically no different from human artists"
"Sleeping with your girlfriend is ethically no different than sleeping with my own girlfriend."
"Shoplifting is ethically no different than buying things"
"Lighting a building on fire is ethically no different than starting a campfire."
"Killing a person is ethically no different than squashing a bug."
These are all fundamentally the same argument, and it's a _very_ slippery slope...
Source: youtube · "Viral AI Reaction" · 2025-04-03T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzAAzfVxUgSRDAc9-R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxwWkp9AI5b3IbfUWJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyRe8hQvbXmr092HZp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyhASdSoA-IK2jCWEh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwpQNk66F4HAt-Lx7V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz_LmAiUP5fKx0ztYt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzEC4rDpV0_Y2FEi9R4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz1Cfu2lNE46lrAOZJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxVi-hbpMrLAFzf3B14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgwHN6NZ_xA2VCqRkox4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
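A response like the one above has to be parsed and sanity-checked before the codes are stored, since the model can emit malformed JSON or values outside the codebook. The sketch below is a minimal validator, assuming the four dimensions shown in the table and only the category values that appear in this sample — the real codebook may define more, so `SCHEMA` here is an illustrative assumption, not the authoritative value set.

```python
import json

# Allowed values inferred from the sample response above; the actual
# codebook may include additional categories (assumption, not confirmed).
SCHEMA = {
    "responsibility": {"user", "none", "ai_itself", "distributed"},
    "reasoning": {"virtue", "consequentialist", "unclear", "mixed"},
    "policy": {"none", "ban"},
    "emotion": {"outrage", "fear", "indifference", "approval",
                "mixed", "resignation", "disapproval"},
}


def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with missing or unknown codes."""
    rows = json.loads(raw)  # raises on malformed JSON
    for row in rows:
        # IDs in this dataset start with ytc_ (comment) or ytr_ (reply).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id format: {row.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
    return rows


raw = ('[{"id":"ytc_UgzAAzfVxUgSRDAc9-R4AaABAg","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"outrage"}]')
rows = validate_response(raw)
print(rows[0]["emotion"])  # outrage
```

Failing fast on unknown values keeps a single bad batch from silently polluting the coded dataset; rejected batches can then be re-sent to the model rather than patched by hand.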