Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Hallucinations in LLMs are going to keep decreasing slowly, resulting in this kind of AI being able to do well enough what it now can barely do.
A couple of years later some other architecture (likely not genAI, or not just genAI) comes around and delivers a large part of what was promised about LLMs.
AGI remains, like fusion energy, about two decades away for a while still.
The bigger question for me is: does popping the LLM/GPU bubble bring down the whole economy, causing a significant recession, or does it just kill the likes of OAI or Anthropic and hobble Nvidia or Meta?
youtube · AI Responsibility · 2025-10-02T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgysIT2spg7TZSSSRjt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwgHWHrfVEigPtgaEt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwU7tJbXuAs94gSFsV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwYBWAXb1zAt5OZKHl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwA6FmI8hwTh-wJQZp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgytAQGU8gISiFDwdcR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw4sA7HiMJ0QZaH7bl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwgg8KvEN5yUMki9_p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwQ1zsgjnlPXfOA_Q94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwQ7-rB3rWfsWZEEWt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
```
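The raw response is a JSON array keyed by comment ID, one object per coded comment. A minimal sketch of how such a payload could be parsed and a single comment's dimensions looked up (the `lookup` helper and the two-entry sample here are illustrative, not the tool's actual code):

```python
import json

# Two entries in the same shape as the raw LLM response above (sample data).
raw = '''[
  {"id": "ytc_UgysIT2spg7TZSSSRjt4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwYBWAXb1zAt5OZKHl4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]'''

def lookup(raw_json: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID (raises KeyError if absent)."""
    codes = {entry["id"]: entry for entry in json.loads(raw_json)}
    return codes[comment_id]

result = lookup(raw, "ytc_UgwYBWAXb1zAt5OZKHl4AaABAg")
print(result["responsibility"], result["policy"])  # prints: company regulate
```

Building the `id → entry` dict once makes repeated look-ups O(1), which matters when many comments are coded in a single batch response.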