Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up with its comment ID or by picking one of the random samples below.
Random samples
- "We dont need ai, we want ai, we dont need robots to do the small things we want …" (ytc_UgwNaGchL…)
- "I wish ai art could be cool and an artist aid and ethical but when company what …" (ytc_UgyIIzvvI…)
- "“If we have time we should think about the feelings of the Ai…” I think we shoul…" (ytc_Ugz_Qpr_2…)
- "Seems like it's how they are in the Asian country. Which it seems good, because …" (ytc_Ugz5o4Gv6…)
- "I will never understand why we are so hellbent on replacing ourselves! Ironicall…" (ytc_UgxJLgZ2p…)
- "I feel like the Internet and things like Wikipedia were the boogyman two decades…" (ytc_Ugw526S74…)
- "I love this podcast. Makes one really think hard about AI and AGI. Things I neve…" (ytc_UgyNx-lF9…)
- "I thought that the end reveal was going to be that the Kristen Bell interview wa…" (ytc_Ugw8Qq4zV…)
Comment
I'm all for AI and want to see it improve as a tool, but I'm also against the unfair and unethical practices described in this video.
I suspect that the vitriolic and accusative natures of some of the responses from AI users are simply a way for them to try to justify their own profit from the current system, or lack thereof. These people know they won't get away with this once regulations come, and so they try to delay it as much as possible by beating down artists.
Such behavior isn't something new nor exclusive to artists. Beating down or discrediting those you abuse or steal from is as old as art itself in human history.
Source: youtube
Title: Viral AI Reaction
Posted: 2022-12-25T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugzz-YWNNPW0q_zXBqd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy9Ue1PJEE_pBx5Pt94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwhOSRLuYWH6utBw4N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwthpM1oogyaCWRctx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"sadness"},
{"id":"ytc_Ugyny9gxst_tVmc1XFB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxpDg81WM8_PCwojxd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzOrPgKZ466-_PE2tt4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzjaVIrlBVbJoBKp_d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxk7RZ7NiNKfSA5e554AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzS5f1gTrrYgol-5wp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
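A raw response like the one above can be parsed and validated before its codes are stored. Below is a minimal sketch; the allowed category values are inferred only from the codes visible on this page, so the real codebook may define more (the `ALLOWED` table and `parse_response` helper are illustrative, not the tool's actual implementation):

```python
import json

# Allowed values per dimension, inferred from the codes shown on this page
# (hypothetical -- the actual codebook may include additional categories).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "contractualist", "mixed"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"approval", "outrage", "sadness", "resignation", "indifference", "fear"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}, rejecting
    any value outside the allowed categories."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: invalid {dim} value {row[dim]!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Validating at parse time keeps malformed or hallucinated category labels out of the coded dataset, so a bad batch fails loudly instead of silently skewing the results.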