Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "There is nothing wrong with making Ai art BUT IF YOU FUCKING CALL IT YOURS, WE H…" — ytc_Ugxk7kFID…
- "İt called itself cuz the owners said to grox "Say curses offensive things and do…" — ytc_UgyeNviRf…
- "I understand your concern! It's a common fear about AI and technology. In the vi…" — ytr_Ugx3dmIRO…
- "@Mel@M@MelodicTurtleMetal're not even responding to the point she's making. Here…" — ytr_UgxGpexOD…
- "AI, as it is at the moment, other than the one at a particular university, is NO…" — ytc_UgxxN_Ci7…
- "I'm not going lie for robot that grouping is good if that can wasn't bulletproof…" — ytc_UgylG7s8b…
- "AI told me at my age it's just not worth it to be a full time employee [ softwar…" — ytc_UgzQl7qUF…
- "AI won't be dangerous as long as it doesn't develop a self-motivation to do thin…" — ytc_UgzXVnoQ8…
Comment
AI art isn't theft. Most art is available to download and most art doesn't have a disclaimer prohibiting use, some do in the description. AI art has flaws when it lacks the right words or doesn't have enough wording. The reality is that AI only works as scripted. With software being written to be more collecting of personal information it's something we need to pay attention to. An example being the early internet days and comparing it to when Windows 95 released, then from that OS to Windows Vista, and then from that OS to now.
Don't fight the AI, use it legally. Turn the computer into the tool, not the opposition.
youtube · 2023-02-13T00:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwbbDOQsQbagpY9tPx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxagC1fdRHBug5mFFR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzOgPb3O9ZHsCmBMut4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxmeKBitjf04fMCx_N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-jLWP1vXG1fnxv1R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz4Bvip8ejsG7w_2wh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx31gSdH_5NbGINTA54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgztSxIRAPC9R2CRZIF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxK9KugCjwbX-WULap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy27XdseWJ9XLLdpBJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}]
```