Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The point is that these things are being driven forward by 0.0001% of the popula…
ytr_UgyttQf3L…
The number of subhumans in this comment section rooting for A.I. is disappointin…
ytc_UgypaiZYU…
Great update, as usual, u/lostlifon! In your free newsletter, you posted a tweet…
rdc_jj8f9jl
AI makes it possible for fewer people to do more work. That is how it replaces e…
ytc_UgwlWWQgR…
"They asked my ai to recite their own work back to them to prove that my ai stor…
ytc_Ugzd9rEK0…
@kreskasd5589 who are you implying is stuck in the past? Did I not imply we shou…
ytr_UgzvpnQcq…
That last sentence of ai that was going to be believer ai; "why? U don't love me…
ytc_Ugx0sDLzj…
I don't understand how ppl can fall for chatgpt. I've been an avid user and the …
rdc_my6tlfb
Comment
Humans can neither store verbatim copies of byte sequences nor reproduce them as AI engines do.
If I were to write an AI based search engine that allowed users to locate copyrighted works which are offered for sale, then this would qualify for a "fair use" exception under the law.
However, if I took the same engine and modified it to produce "original" works which incorporated byte sequences stored in a database (which is what a large language model, or LLM, actually is) for use by a generative AI algorithm, then the "fair use" claim no longer holds up under close examination.
The central issue surrounding "fair use" is intent. Companies which have trained models for the purpose of having AI engines generate output based on the content have every intent of using this data for their own financial gain at the expense of the copyright owners.
youtube
AI Responsibility
2025-04-03T22:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugwhd1s0rPPMd35MAl94AaABAg.AHbHhv7wMlkASt7TbFY5FS","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgzSDg_HjrUHXmEPm6V4AaABAg.AGQLEEvcdmlAHeMEoRLdAd","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugw4t7KrwgJ64STlsWp4AaABAg.AFh1jfpaAE0AGTawvZbsgg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_Ugw4t7KrwgJ64STlsWp4AaABAg.AFh1jfpaAE0AHeINSOvq9o","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyIMwVT69itGj3pW0Z4AaABAg.AEjTRmMD2N5AGTccxlvrg3","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytr_UgyIMwVT69itGj3pW0Z4AaABAg.AEjTRmMD2N5AGod5KC1vMe","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwysebPJrPVIAAn5Vh4AaABAg.AEiYP1MwUvRAGTc3L8yKU0","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytr_UgyUJ1o98exLqWk4qCl4AaABAg.AEi3shogDsfAGTXSk5LrTR","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyB0f5t2tIXzAgCLrJ4AaABAg.AEi1M8bxBTTAEyPcGMxVGc","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_UgyB0f5t2tIXzAgCLrJ4AaABAg.AEi1M8bxBTTAF00mY9qxEG","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
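The raw response above is a JSON array of per-comment codes, one object per comment ID, with one value for each of the four dimensions shown in the Coding Result table. A minimal sketch of parsing and validating such a response in Python — the allowed values below are inferred only from the labels visible in this export, and the real codebook may define additional categories:

```python
import json

# Allowed values per coding dimension. Illustrative only: these sets are
# inferred from the labels visible in this export, not from the full codebook.
SCHEMA = {
    "responsibility": {"user", "developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "approval", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed records.

    A record is kept when it has an "id" and every dimension holds an
    allowed value; malformed records are dropped rather than repaired.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Index by comment ID for the "look up by comment ID" view.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
by_id = {rec["id"]: rec for rec in parse_codes(raw)}
print(by_id["ytc_example"]["emotion"])  # fear
```

Dropping (rather than repairing) out-of-schema records keeps the coded dataset clean and makes model formatting failures visible as missing IDs, which can then be re-queried.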