Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its ID.
Random samples:

- "Alert, alert now they are having you scan your palm prints when you go for your …" — ytc_UgzvHHn6y…
- "The only problem here is the legality of it. People just shouldn't be able to pr…" — ytc_UgyUQ9tvy…
- "Yeah. This makes me wonder if big studios will continue to allow AI to be availa…" — ytr_UgzPTgRhY…
- "This dude looks like he ate the AI before it could get out of control.…" — ytc_UgwKW99SR…
- "People taking AI technology that will intentionally make you take your life beca…" — ytc_UgxsGSSjn…
- "This is proof that AI will never surpass humans, because humanity is its only ba…" — ytc_Ugwjt6-dJ…
- "I actually asked Google Gemini something along these lines, and this is what "it…" — ytc_UgwT4GLDV…
- "Given how poorly (safety-wise) the bat bomb went during testing, I can't see aut…" — ytc_Ugx6im3wW…
Comment
Since AI is still at its core just rehashing existing answers based on probability. Its still completely unable to think outside of the existing box. Therefore anyone who wishes the slightest bit of ingenuity, flexibility, creativity or anything else that makes a person genuinely stand out. You still need a human.
If companies want to stop disrespecting their employees and start using AI to streamline their maintenance pipelines etc. then maybe it would work. But as long as the rich keep disrespecting the people actually doing all the work their new techslop will continue to fail.
youtube · AI Jobs · 2026-02-28T18:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzbAaozfAzW5cXGa-h4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzTL3bh9FdTKA9r8XN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyO7sqxiPjfmj7Zgr54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwfzhbu8w2u8Lw-Sil4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxcG8osm9Miou2lH9t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyUHmVhBc0RJ8xYGPB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx4WCuiZDdPdL6uHUl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxTJSmfP7Eri_-K2bR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxSfeBIfiAKRNvIrpZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx4XFaK7j1c506UEIh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
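The raw response is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of how such a batch can be parsed and indexed for ID lookup (the variable names and the two abbreviated example entries here are illustrative, not the tool's actual implementation):

```python
import json

# A raw batch response in the same shape the model returns:
# one object per coded comment, keyed by comment ID.
raw_response = """
[
  {"id": "ytc_Ugwfzhbu8w2u8Lw-Sil4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxTJSmfP7Eri_-K2bR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the batch by comment ID so one coded comment can be
# fetched directly, as the "look up by comment ID" view does.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_Ugwfzhbu8w2u8Lw-Sil4AaABAg"]
print(row["responsibility"], row["emotion"])  # company mixed
```

Keeping the raw array alongside the per-dimension table makes it easy to audit whether the displayed coding (responsibility, reasoning, policy, emotion) matches what the model actually emitted for that ID.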