Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The Gen AI we think of today was an emergent property of LLM. That makes it hard…
ytc_UgzpYMpQi…
When you take God out of the picture, the answers to the very things you’re talk…
ytc_UgxjWE36S…
"Then I saw a second beast, coming out of the earth. It had two horns like a lam…
ytc_UgzLtVFAK…
Wow, just... wow. It seems like ChatGPT follows the rule of having the confiden…
ytc_UgxdvF506…
See what you don't understand is AI is only as good as the programmer who progra…
ytc_Ugy-8LQ_v…
I don't fucking need AI.
I use it optionally and for convenience.
If it disappe…
rdc_nufszx0
everybody needs to calm down it's not like the robot actually has any sort of in…
ytc_UggUWgyXa…
He shot himself with his step father's gun. The gun was secured and stored prope…
ytr_Ugz7bRne9…
Comment
Wait, why would AI spitting out GPL code be violating the GPL? Distributing GPL code is specifically permitted under the GPL (so long as the source is provided too). I guess the issue could be that the entire source is not provided (only the snippet) - but I figure if the source isn't altered in any way perhaps you may get away with it. The real concern would be derivative works - AI generating code based off GPL code that it doesn't release the source for. Though you might argue so long as that code snippet is in turn licensed under GPL when distributed, then it is OK. But you may also argue that the entire AI model is a derivative work, and so the source of the model should be released too - and that includes not just the weights but even the entire build chain because the source has to be distributed in such a way that it can be compiled. Though if proprietary blobs are needed to compile, they don't have to be provided, but all other scripts used in the build chain must also be distributed. I reckon that is the sticking point with AI generated GPL code.
youtube
2026-01-16T15:5…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
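The four coded dimensions above could be carried as a small typed record. A minimal sketch in Python, assuming the dimension names and values shown in the table; the `CodingResult` class and its field names are hypothetical, not the tool's actual data model:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CodingResult:
    """One coded comment: the four dimensions plus the coding timestamp."""
    responsibility: str  # e.g. "distributed"
    reasoning: str       # e.g. "deontological"
    policy: str          # e.g. "unclear"
    emotion: str         # e.g. "indifference"
    coded_at: datetime

# The values from the table above
result = CodingResult(
    responsibility="distributed",
    reasoning="deontological",
    policy="unclear",
    emotion="indifference",
    coded_at=datetime.fromisoformat("2026-04-26T23:09:12.988011"),
)
```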
Raw LLM Response
[
{"id":"ytc_UgxbieGD4ylOBWOCh254AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxU1hPdws84rUBusF54AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzufevwLQ8KmV2nQd54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzKPpNBCVBUcIcYzWN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"unclear"},
{"id":"ytc_Ugxod7iuD4xWUSKTHrd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyE1pPlBBmdbpIHkQR4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxfnqimqRpgDvSiL3x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzxzkozCFdd4xgdwAx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy8G62U8PV1KAPhavp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_Ugyurd9-FZPm6JP-ln14AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
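The lookup-by-ID view can be sketched against this response format: the model returns a JSON array of codings, which is indexed by comment ID. A minimal sketch, assuming the array shape shown above; the `parse_codings` helper is hypothetical, and the three entries are copied from the raw response:

```python
import json

# Three entries taken verbatim from the raw LLM response above
RAW_RESPONSE = """[
{"id":"ytc_UgzKPpNBCVBUcIcYzWN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"unclear"},
{"id":"ytc_UgzxzkozCFdd4xgdwAx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy8G62U8PV1KAPhavp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"}
]"""

def parse_codings(raw: str) -> dict:
    """Index the model's JSON array of codings by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = parse_codings(RAW_RESPONSE)

# Look up one coded comment and read off its four dimensions
row = codings["ytc_UgzxzkozCFdd4xgdwAx4AaABAg"]
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dim}: {row[dim]}")
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: one pass over the response, then constant-time lookups.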