Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Its ok I think they waist more on charging these damn robot electric 💡⚡ must be …" (ytc_UgyHSz7KG…)
- "This is why they wouldn’t need humans in the future look how real the AI looks l…" (ytc_UgwaAgI78…)
- "They’ll want all the jobs that can’t be replaced by Ai to be menial jobs doing d…" (ytc_Ugz4KFCQl…)
- "A journalist with more credentials in the AI field (Ezra is just way too much of…" (ytr_Ugxcc7er2…)
- "Enginers Cent be replaced by AI. Just my oppinion… Also cleaning is way more lik…" (ytc_UgzOXINeH…)
- "Fake, the robot is programmed to work. Even if it malfunction, it would’ve just …" (ytc_UgyuicbXW…)
- "For now this is funny, until you realize that in the near future both clips will…" (ytc_Ugxf3PRLD…)
- "The trajectory of technological advancement points toward an inevitable integr…" (ytc_UgzctA-nx…)
Comment
I hate to be negative, but I'm getting a little tired of the general technoptimism. Saying "we're just one revolutionary science paper away from a new boom on AI" is strange to me.
Technically, sure. But isn't it the case with literally everything else? :) And who's to say if this paper can eventually come at all, and we haven't hit the ceiling for this tech.
Maybe it's the bubble I'm in, but I think we've just passed the peak of expectations and start barreling into the disillusionment.
The only thing I wonder is how far we'll go with genAI. Will it be as useless as Blockchain or not. I'm erring on the not side with some clear use cases present, but I don't think it's gonna be the game changer people seem to think it will be.
Source: youtube · AI Responsibility · 2025-10-01T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxwMAnciHw7LkWvmLR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxlFBPE0zkei83oeXN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyKCoONbZBiv6Dk-cB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxT7CkFlan_8u_k4D94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyUtWjYCBMnk1-oJOd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzKlKEmpRmtEVHd99N4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyftXnjXbdcAPS2Cg54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxmjZ4bdnR1fq9xJWt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzNxAK9B05CXsCXouh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzIoTNTPJzcd8I6W5l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}]
```
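The raw response is a JSON array with one object per comment ID, carrying the four coded dimensions. A minimal sketch of how such a response could be parsed and sanity-checked before the values land in the coding table; the allowed-value sets below are inferred only from the responses shown on this page and are an assumption, not the full codebook:

```python
import json

# Allowed values per dimension, inferred from the responses above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "government", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "mixed", "outrage", "resignation", "fear", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response; coerce out-of-codebook values to 'unclear'."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                # Fall back to "unclear" rather than dropping the record,
                # so every comment ID keeps a row in the coding table.
                rec[dim] = "unclear"
    return records

# Hypothetical one-record response for illustration (not a real comment ID).
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_coding_response(raw)[0]["policy"])  # regulate
```

Coercing unknown values to "unclear" instead of raising keeps a malformed model answer from stalling a whole batch; a stricter pipeline could log the offending ID instead.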