Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
They're just asking for a scenario similar to The Electric State movie.
You can't ask a human to do the work for 100,000,000 or more humans, but an AI does that every few days, and it uses data that it doesn't have any right to. They need to hire a slew of different kinds of artists contracted to create art for them and only them that the artist has no claim on after leaving that company/technocracy. Whatever one of them does _after_ they leave that employment is theirs and theirs alone and no company has a right to it without the artist selling it to them directly.
But none of these AI companies want to have to pay for that.
youtube
2025-08-22T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwlzou_6MMfX8WIu2J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxTPLcU9wW0mplpHZJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz7HCjtNF7lstseo3p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugzoh9lPu2qjgq9WSap4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxOYk1lhL9hRYFQdzx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgztUqXoGSdl1D779LF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwtJdR5o_yabt7my5d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx4Db_MiWnLHkZAaVh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxWgMYJCEaImJBm4HR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzZRR3_igzsXMH7Onl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
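The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such a batch could be parsed and validated in Python (the `SCHEMA` value sets here are assumptions inferred only from the values visible in this example, not the project's full codebook):

```python
import json

# Allowed values per dimension -- ASSUMED from the sample response above;
# the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "user", "company", "distributed"},
    "reasoning": {"mixed", "deontological", "consequentialist"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"resignation", "outrage", "indifference", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw batch response and index the codings by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        comment_id = row["id"]
        # Reject any row whose coded value falls outside the known categories.
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim} value {row.get(dim)!r}")
        coded[comment_id] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Hypothetical single-row batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
      '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_batch(raw)
print(coded["ytc_example"]["policy"])  # regulate
```

Validating against a fixed value set at ingest time catches the common failure mode where the model invents an off-schema label, so bad rows fail loudly instead of silently entering the coded dataset.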