Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (comment previews and IDs are truncated in the source):

- ytc_UgxRrFghY… "Something different about this video. Great that you dove into sharing thoughts …"
- ytc_UgwJ_eXCx… "Help kickstart Haven Social and make the internet, video creation and social med…"
- ytc_Ugzs7bxBK… "What we do know is that the current system is broken and should be discarded…"
- ytc_UgxA0TouF… "Thank God. Otherwise, place was going to be full of garbage. Was watching Second…"
- ytc_UgxO8b4OX… "I don't think this is real. Nobody would hand a loaded gun to a robot.…"
- ytc_Ugx-THrMG… "You're saying that they don't understand how it's working, but you may be giving…"
- ytc_UgxxDu6I_… "Own the EHEYES food. Invest in dividend paying power utilities that are underval…"
- ytc_UgxTSl12J… "Technically speaking ai isn't racist it a logical expression machine who works o…"
Comment
Absurd! Humanoids have always been useless as "workers" and it's easy to explain why AI can't change that.
We see how Amazon and others are replacing people and it's not with humanoids. Because robots ALREADY use Vision and AI and they don’t look like humans because humans are limited. (they have to climb ladders, hands too big for small things etc.) Automation has been replacing people for decades and it hasn't stopped! So if a humanoid could do it, it’s already automated! (1) Just look at history. (2) So NO, they WON'T become useful as technology and AI improves.
In other words, the typical factory job is doing things that robots (automation) cannot do. And humanoids ARE robots. The form does not matter.
Look at it this way. If AGI could assemble widgets or do surgery or manicures, etc. it would need special tools not batteries, hands and legs. Optimus is not strong enough to dig a hole or help someone get up off the floor. It can't even work at McDonalds. If it makes a mess putting popcorn in a cup with a human brain behind it, could it do much better with an artificial one? Can it wash its hands? How long would it take it to change gloves, start a roll of tape etc.? A humanoid MUST do a job on par with a person because workplaces are made so people along with machines maximize asset utilization. That is why you CAN'T just put two humanoids in place of one person.
This is why no one has EVER shown ANY humanoid doing any useful job that a person was paid to do. But AI could make it a fun toy or criminal.
If you have a trade job, you’d realize it would be impossible or impractically slow to do it with robot hands. Have you ever seen it use hand tools? No amount of AGI Learning or training is going to make the hands better with tools, scissors, small parts etc. Making hands equal to a person is impractical and unnecessary.
Hot, heavy, dirty and dangerous jobs. There are 100s of millions of them around the world. And if a humanoid could actually do it, LOL, most of those jobs pay super low wages in places where there is no electricity to charge it anyway.
Most of today's humanoids can barely complete the tasks from the DARPA robot challenge 10 years ago!?
"Optimus will be the best surgeon" !? When AGI can do surgery, it will use existing surgical robots. A humanoid will not wield a scalpel.
"it's a humanoid so it can work in a human environment" if it's not useful then that doesn't matter.
Who will buy them? Certainly not factory or warehouse managers. They can't do trade jobs or hard labor. They will be lousy housekeepers, tour guides, popcorn servers, entertainers etc. ridiculed as toys and criminals will use them for nefarious purposes. At best it could be a checkout clerk that will restock the store at night if it ever really gets a brain. Credit only, robot hands can’t handle money well.
But they would be really nice if a handicapped person could use it as a telepresence to go shopping, do simple chores, etc. Please make a version like this.
(1) It's first principles. If the goal is to emulate a human then you must make a humanoid/android. But if the goal is doing things that a human can do, then first principles DOES NOT dictate a humanoid. So AI is already "embodied" or "physical". (ill conceived terms, us old folks say, smart sensor or smart tv) And if you do want to emulate a human, you must choose which characteristics you want. It may need sensitive hands to feel slight imperfections or it may need durable hands for hard work. So a one-size-fits-all like Optimus is not practical. Unless it can reproduce, heal and recharge itself for decades by simply eating and sleeping.
(2) Warehouse managers needed people to store and retrieve items. Now warehouses are full of non-humanoid smart robots. People had to pick parts out of a pile and place them with correct orientation. Now one-arm, one-eyed robots do it. People had to pick up parts to inspect and measure them. Now lasers measure them and vision systems inspect them. PID controllers became self-tuning. As tech/learning/AI improved, so did machines and machine controllers. Just like computers taking jobs with spreadsheets and scanners. All the while, someone was saying a humanoid would soon do the work.
Source: youtube · AI Jobs · 2026-02-26T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
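The coding result above assigns one value per dimension. As a minimal sketch of how such a record could be validated, the snippet below checks each dimension against a categorical scheme; note the allowed value sets are inferred only from the examples visible on this page (the real codebook may define more categories), and `validate` is an illustrative helper, not part of any real API.

```python
# Allowed values per coding dimension, inferred from the samples on this page
# (assumption: the actual codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"outrage", "approval", "resignation", "fear", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the scheme."""
    return [dim for dim, ok in ALLOWED.items() if record.get(dim) not in ok]

# The coding result shown in the table above passes cleanly:
record = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "indifference"}
print(validate(record))  # []
```

A check like this is useful because the raw model output is free-form JSON, so an off-scheme label would otherwise slip silently into the coded dataset.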
Raw LLM Response

```json
[
{"id":"ytc_UgwyIGsxnqLvxD7mO-N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzb5kWEFuK2PQrBBpV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwP7Bh9IMgB6iON91R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugy2XNPhgBbNFMLbP9p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyIr-zBG_tecqB_gEN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwBbSysL-XG9SsojCJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy_LCq-tFe1PGdm9jd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzvjkfqXE7gAIsVSfJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw9jkbP4TqyGhjmEIt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwuMIMuzdUA94ESwyh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
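The "Look up by comment ID" feature above amounts to parsing this JSON array and indexing it by `id`; since the page displays IDs in truncated form (e.g. `ytc_UgxRrFghY…`), a prefix match is also needed. A minimal sketch, using two of the records from the response above (the helper names `index_codings` and `lookup_by_prefix` are illustrative, not part of any real API):

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw_response = '''
[
 {"id":"ytc_UgwyIGsxnqLvxD7mO-N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwBbSysL-XG9SsojCJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
'''

def index_codings(payload: str) -> dict[str, dict]:
    """Parse the model output and key each coding record by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(payload)}

def lookup_by_prefix(index: dict[str, dict], prefix: str) -> list[dict]:
    """Resolve truncated IDs like 'ytc_UgwBbSysL…' via prefix match."""
    return [rec for cid, rec in index.items() if cid.startswith(prefix)]

index = index_codings(raw_response)
match = lookup_by_prefix(index, "ytc_UgwBbSysL")[0]
print(match["emotion"])  # fear
```

Prefix lookup returns a list because a short prefix can in principle match several IDs; a real implementation would surface all matches rather than assume uniqueness.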