Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below.
- "My god people will just hate on whatever is in trend Most of the ppl here don't …" (ytc_UgxpTOVvO…)
- "That is it. Maybe he focus too much on AI, and he forgot how to be a human…" (ytr_Ugzud191Q…)
- "Overwhelmingly, these issues exist primarily because we collectively employ capi…" (ytc_Ugx4FFSo3…)
- "The whole deep fake, AI generation, and fair use situations all need to be broug…" (ytc_UgyMHJmX9…)
- "The other side of the coin is that AI, lacking in emotion but smarter than human…" (ytc_Ugz1EmBKG…)
- "Remember guys, if an AI "Artist" says their stuff is better.. well, I think y'al…" (ytr_UgwYgebIG…)
- "It makes literally zero sense from Artstation's PoV to ban AI art entirely. If A…" (ytc_Ugzdt9ZZP…)
- "People aren't getting laid off because of AI, that is the PR/cover story busines…" (ytc_UgyGoOrzq…)
Comment
One of the biggest problems in AGI is that it requires far more computational resources than LLMs which are more or less super-charged search engines that can talk. But as the video says, AGI is not necessary, we already (for decades) have AI systems that can out perform human experts. The biggest issue with these systems is that they need to be designed by scientists and engineers. You also cannot talk to them, requiring some programmer to create an interface.
youtube · AI Moral Status · 2025-04-26T19:1… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_Ugxo2GjUyERtyIV9xA14AaABAg.AHORAZC0DOLAHQQtQNdBpB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugwmrb9Qi9ECXEUn8sJ4AaABAg.AHOOeUxaGMIAHPT2Ppj-BH","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugwmrb9Qi9ECXEUn8sJ4AaABAg.AHOOeUxaGMIAHQZVsu6sZu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwmrb9Qi9ECXEUn8sJ4AaABAg.AHOOeUxaGMIAHUZ97MRZWl","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHOR6mKPKOh","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHOTM_eqCs7","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHOzGIUf6B2","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHP-2c4TwGz","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugxj5C5mPY4raUGzyW94AaABAg.AHOKYrbTpAmAHQB9eajQfs","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugxj5C5mPY4raUGzyW94AaABAg.AHOKYrbTpAmAHQFM9hvrDb","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
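Since each raw response is a JSON array of records keyed by comment ID, a by-ID lookup can be sketched as below. This is a minimal illustration, not the pipeline's actual code: the helper name `index_by_id` and the `DIMENSIONS` tuple are assumptions; only the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown above.

```python
import json

# Two records copied verbatim from the raw response above, for illustration.
raw = '''[
{"id":"ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHOR6mKPKOh","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugxj5C5mPY4raUGzyW94AaABAg.AHOKYrbTpAmAHQB9eajQfs","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

# The four coding dimensions, as seen in the Coding Result table (assumed complete).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response and index the coded records by comment ID."""
    records = json.loads(raw_response)
    by_id = {}
    for rec in records:
        # Each record must carry an id plus all four coding dimensions.
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"malformed record: {rec}")
        by_id[rec["id"]] = rec
    return by_id

coded = index_by_id(raw)
rec = coded["ytr_Ugxj5C5mPY4raUGzyW94AaABAg.AHOKYrbTpAmAHQB9eajQfs"]
print(rec["emotion"])  # → approval
```

In practice a lookup tool would also validate each dimension against its allowed code set (e.g. rejecting an emotion outside the codebook), but the allowed values are not fully listed here, so this sketch checks presence only.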