Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The only thing that is certain is that bad people will not stop or slow down dev…" (ytc_UgwSkS9Ji…)
- "Facial recognition is a useful tool, but this is outside its scope. Every machin…" (ytr_Ugz_jVhKT…)
- "Maybe AI could free the middle class from working if we nationalize it. Time for…" (ytc_UgwBl5fTD…)
- "Omg AI is just like a kid, don't know that not everything they hear meant to be …" (ytc_UgwO1FDKp…)
- "AI blah blah blah. I just want to know why Will Smith is eating so much spaghett…" (ytc_UgxZfGzUC…)
- "A.I. Is just Satan's way to suck in more Souls. How quick the lost fall in line …" (ytc_UgzsYZwPh…)
- "As smart as these people are, I'm always dumbfounded that they forget to discuss…" (ytc_Ugx6VQiyG…)
- "also ai does need abstract by generalisation otherwise it will be inefficient so…" (ytr_UgyvhVGJj…)
Comment
Eric Schmidt seems to be looking past the deficit it will leave in the human socio/political/economic realm but he does rightly point out the electrical power requirement problem, and it is an enormous problem, and he glazes over it. It should not be hard to convince Eric that "this time is different.", and it floors me that he could be so callous, His statement, "everyone assumes that automation will eliminate jobs", goes on to say that [paraphrasing] "naw, jobs will be lost and new ones will be created just like in the past with loons and tech entering civilization, some jobs are lost, new ones are created that more or less take the place of those that were lost."

But then goes onto to say, "There were will be a few humans that are working very very hard (because productivity has sky rocketed through AI) and of which the rest of the humans rely on; i.e. because there won't be many jobs for the rest of them/us. So is this time different? Idk, but I believe it is. Assuming that AI gets to a point in its evolution where it's in the ballpark of AGI, technically speaking, it will no longer need humans to perform tasks with or for it. So going back to the loon in history: Loon tech, people freak as jobs are lost, industry changes, loon needs operators, new jobs. Then machinery, then computers, all needing human input then AGI tech, people freak as jobs are lost, industry changes, AGI needs no human input (sans a few helpers), no new jobs. The tech job cycle is broken. I don't know how it can go any other way.

I think a very important question to be asking here is who s pushing this and why? The obvious culprits of money and power of course are present, but I think there is much more to it as looking past the the obvious and toward what the vision is for the entire planet when AGI occurs, I doubt there is one as most of the time it is spoken of, it is superficial and ignores al the very real problems that will occur in this shift. Of course they don want to talk about that.
Source: youtube · 2026-04-05T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw-ViyepDZBcnn_OWJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyaqjPGOcVt2pBSZ7J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw1-_4dNhh-kWWiu2F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzkvmL2Tj6_XJ28hPZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwo3qTCctLqYk-ScCZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyV1cXiTXKqp0VpiXJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzTkcn_v9x6X4YpZpF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz65LQ2TTu5cEEv8r94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugyn8gS4S7_2oEH3Czt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxCttHe-pe8hOywd1t4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
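A raw response like the one above is a JSON array with one object per comment, carrying the four coding dimensions. As a minimal sketch of how such output could be parsed and sanity-checked, the snippet below validates each row against the value sets observed in this dump (the allowed-value lists are inferred from the samples here, not from the actual codebook, and the example ID is hypothetical):

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# The real codebook may define additional values.
DIMENSIONS = {
    "responsibility": {"company", "developer", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"outrage", "fear", "approval", "mixed", "indifference", "resignation"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with valid dimension values."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in DIMENSIONS.items())
    ]

# Hypothetical single-row response for illustration:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
print(parse_codes(raw))
```

Dropping (rather than repairing) rows with out-of-vocabulary values keeps the downstream dimension table clean; rejected IDs could instead be queued for re-coding.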