Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The topic discussed in the video is actually from a chapter in the book Life 3.0…
ytc_Ugwndn9VL…
Automation will take your jobs! this is a issue!!!! ai art is cheating!!!!!!! it…
ytc_UgxqjPRlw…
Max Tegmark knows very well that the models in 2023 were conscious. Right Max? T…
ytc_Ugy9oV1t5…
@BrendanDellvery soon they will and it will be the birth to something called the…
ytr_Ugz3K-5YL…
They are. AI has been a thing since it got released in 2010 for companies. Only …
rdc_jj7hgdt
These people are under the impression the AI models are capable of inventing new…
ytc_Ugx54aY3u…
Im imagining the day in 20 years where i wake up next to my wife and our dogs in…
ytc_UgwL0iro5…
27:08 and there is plausible deniability.
“We swear that wasn’t our robot that …
ytc_UgzB0OcPr…
Comment
If they only continue to do what they are doing now, constantly improving without getting to AGI, I would consider the LLM as important a productivity tool as the damn spreadsheet which continues to be one of the most significant advancements in human productivity ever. LLMs are right up there with Computers, networking, and the development of the Internet itself. Personally I have no doubt that they will eventually achieve AGI and quite possibly ASI (Artificial Super Intelligence) in the not too distant future.
As LLMs are right now they have some, as you say, unfixable problems, but they improve every iteration. I believe they will become the basis of modern operating systems. The world of MacOS and Windows (WIMP - Windows, Icons, Mice, & Pointers) will vanish and be replaced by dashboards and conversational computers with LLMs and custom automations at the heart.
Back in the late 90s many people had given up on self driving cars as well.
youtube
2025-11-27T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgycAHJI6QF5fdAG-lB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxblKZaLzAT3mu6TIR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzc0eBBvKwNR0D0OyR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzdrVJ53CPxqnfxNPh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzs2q2y09RHe1gU1Td4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwqAMRbRMphohcakdd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhAR1gvu95cACD9_54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwhQ2fUeLaSnGl4sIJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwtE3N6RYpad645Cp14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxINFB3kenh1VjuTql4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
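The raw response above is a JSON array with one record per comment, each carrying the five fields shown in the Coding Result table (id, responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed into the "look up by comment ID" view — this is an illustrative Python routine, not the tool's actual code; `parse_codes` and the validation behavior are assumptions, and the two sample records are copied from the response above:

```python
import json

# Two records copied from the raw LLM response shown above.
RAW_RESPONSE = """[
  {"id": "ytc_UgycAHJI6QF5fdAG-lB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzs2q2y09RHe1gU1Td4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]"""

# The five dimensions from the Coding Result table.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw: str) -> dict:
    """Parse a raw JSON array of codes into a lookup table keyed by comment ID.

    Raises ValueError if a record is missing a dimension, so malformed
    model output fails loudly instead of being silently half-indexed.
    """
    codes = {}
    for rec in json.loads(raw):
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {sorted(missing)}")
        codes[rec["id"]] = {k: rec[k] for k in EXPECTED_KEYS - {"id"}}
    return codes

codes = parse_codes(RAW_RESPONSE)
print(codes["ytc_Ugzs2q2y09RHe1gU1Td4AaABAg"]["emotion"])  # -> outrage
```

Keying on the comment ID rather than array position keeps the lookup stable even if the model returns records out of order or drops one.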