Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
While AI will certainly have lasting impact, from a technical point of view, it is currently un-delivering massively in key areas. There are already problems with scaling the LLM models further, data result quality and security. Also AI operation is hugely costly and mostly not profitable. Actually multiple technologies are classified as AI, not just LLMs. By 2027 the AI bubble may be bursting, but that remains to be seen. Characters like Musk and Altman are sales people with a track record of sketchy, outlandish claims. "Our product is so good it will destroy society" has a grain of truth, blown out of proportion for marketing. We should stop putting these loudmouths on a pedestal and tax them properly.
I'm just afraid that if 2025 taught us anything: just because something is idiotic and doomed to fail, it does not mean that people won't attempt it.
Platform: youtube
Video: AI Jobs
Posted: 2025-10-24T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy7Qp4J0FJqhXVyALJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyAlJL59n5hLUazRuZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNoEghalDfIhr3Bhp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwoPij75HAAxEVcy1t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyPlIRiSXATdaqt6ON4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxUC_ZE_TdylD1mLrl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBzdSTchba-kbTNYN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyInr6JDd0DekDyfcZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw2fQ6MKWdUDy4N5SR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx341QZlrELVCUgycd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
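The raw response above is a JSON array of per-comment codes, one object per comment with the four dimensions from the Coding Result table. A minimal sketch of parsing such a batch and looking up one comment's codes — the allowed value sets below are an assumption inferred from the responses shown here, not a confirmed codebook:

```python
import json

# Allowed values per dimension (assumed from the sample responses,
# not an official codebook).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "unclear"},
    "reasoning": {"unclear", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"indifference", "approval", "outrage", "fear",
                "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes},
    dropping entries with values outside the allowed sets."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[row["id"]] = codes
    return coded

# One entry from the response above, used as a lookup example.
raw = '''[
  {"id": "ytc_Ugy7Qp4J0FJqhXVyALJ4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"}
]'''
batch = parse_batch(raw)
print(batch["ytc_Ugy7Qp4J0FJqhXVyALJ4AaABAg"]["emotion"])  # indifference
```

Filtering against a fixed value set is a cheap guard against the model inventing labels outside the coding scheme; invalid rows are silently dropped here, but logging them for manual re-coding would also be reasonable.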