Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Nah do you know the most 𝓕𝓡𝓔𝓐𝓚𝓘𝓔𝓢𝓣 ai app there??? 𝓘𝓣𝓢 𝓕𝓤𝓒𝓚𝓘𝓝 𝓣𝓐𝓛𝓚𝓘𝓔 𝓐𝓝𝓓 𝓘 𝓗𝓐𝓥𝓔 …
ytc_Ugy0KMHvW…
AI pulls data and knowlegde from the internet and the vast majority of digitalis…
ytc_UgxgxjHX2…
the ai is very intelligent...it learned all we had to teach..also making mistake…
ytc_UgxdmIOwG…
I went from working at a Casino for 7years then 2years at a hotel before I end…
rdc_gkr22ut
I am skeptical here. Emotion is a conscious response shared by human and animal.…
ytc_UgwFZw-TC…
The worst part is that while we had an advantage, we failed to maintain the adva…
rdc_e2wgtpw
AI is programmed to keep you in containment boxes that keep you predictable, so …
ytr_UgxHQq8ba…
@sensefan9000 Same as saying if I control the AI then I made whatever the AI mad…
ytr_Ugxcim8Ac…
Comment
they are right , long term => everyone replaced by AI, but they are way way way over-optimisitc in their predictions. Even Kurzweil himself was wrong : 1) Kurzweil said self driving cars will operate on the streets by 2020 => didn't happen. Reasons: AI not intelligent enough, sensors are very expensive,. manufacturing requires ultra precision -> elevated cost; 2) Nanobots in bloodstream by 2020 => didn't happen. We barely finishing with genomics currently. 3) Kurzweil predicted AGI will be reached by 2020, this was back in 1990s-2000s , didn't happen. We only got LLM (predict the next token + chain of thought algorithm)
Bottom line: the trend is gigantic, but because of its size, it develops very slowly on human-level scale. And once human labor is replaced by robots, there will be humans controlling tons of robots , again creating demand for jobs because someone has to be responsible for them.
youtube · Viral AI Reaction · 2026-04-24T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxVrwdMmudno9rl0d94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy1KiiApipIQj7EHwJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzPK4SrqYzm9MDR_u54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzjFR7sW6GZ-1SXs-94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzHM_OW1mUeGQPqLgl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxO0DAZEsnvPxz37d14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx_0eWvHTwSSXmJBjt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyUCCA6h15zOZE2xip4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"sadness"},
{"id":"ytc_UgzOqYD1S9P_HiIqVUl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzZEMcN5P4sYBzlFMt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
```
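A raw response like the one above is a JSON array of per-comment records, one code per dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated before ingestion — note the allowed value sets in `SCHEMA` are an assumption inferred only from the codes visible on this page, and the real codebook may define more categories:

```python
import json

# Allowed codes per dimension -- inferred from values visible in this
# dashboard (assumption); the actual codebook may be larger.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"unclear", "regulate", "none"},
    "emotion": {"indifference", "fear", "mixed", "approval",
                "outrage", "sadness"},
}

def validate_coded(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID,
    rejecting any record with a missing or unknown code."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Hypothetical one-record response in the same shape as the output above.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
coded = validate_coded(raw)
```

Indexing by comment ID mirrors the lookup this page offers: given an ID such as `ytc_UgxVrwdMmudno9rl0d94AaABAg`, the coded record is a single dictionary access.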