# Raw LLM Responses

Inspect the exact model output for any coded comment: look it up by comment ID, or click one of the random samples below.
- Job market bout to be cooked and people still i want to compare ai to the calcul… — `ytc_Ugz1F-B_U…`
- so many people too busy regurgitating the same snarky quips to notice that the A… — `ytc_UgyFAPuiS…`
- Oh no... Now I'm thinking back to when I tried that "look under there" joke to C… — `ytc_UgxxBNxlQ…`
- i had a friend who got into a character AI thing and then she just completely le… — `ytc_UgyYgYCW9…`
- Hmm, I've just renovated my house. About 50 people worked on it from different f… — `ytc_UgzTVeNXc…`
- I personally wont even entertain these ai art stans with an argument. It should … — `ytc_UgyDI6aFK…`
- GPT 5.1 is actually a different model than ChatGPT and GPT 4o. let alone gpt 3.5… — `ytc_UgzedwDHL…`
- I for one will never use AI art as long as I live because I stand against its pr… — `ytc_UgzaeHF3a…`
## Comment

> I would go even one step further and claim that we will become the robots. Like you said, they won't get humanoid robots to work cheaply, so we will do the manual work - that an AI tells us to do. Maybe even with nice AR glasses, so the AI can tell us exactly what to do and we don't even need to think.

Source: youtube · AI Jobs · 2025-08-29T05:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response
[
{"id":"ytc_UgzpPWHuBwtrK5Q885h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzsWNYJzeyBZNqsJyN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJ6QXBlMUDa3T58sJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxeWH4OSYdZ9fzzK5F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyn9ykPlnRpeFNc3Zp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwWABZGAh3dGPZqq7t4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzMa8UTw_WGDJ1AnHp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxahWsCECMivLQcMad4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxJwvMbyJEIXDJmwSF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz5rkr0aRHXc-e7H1d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
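The raw response above is a JSON array of per-comment codes, which the inspector indexes by comment ID to render a Coding Result table like the one shown. A minimal sketch of that parse-and-index step follows; the allowed value sets per dimension are assumptions inferred from this single sample, and `parse_raw_response` is a hypothetical helper, not the tool's actual code.

```python
import json

# Allowed values per coding dimension — inferred from the sample output
# above; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "approval", "outrage", "resignation", "mixed", "indifference", "unclear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) and
    index it by comment ID, dropping rows that fail validation."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id", "")
        if not cid.startswith("ytc_"):
            continue  # not a YouTube comment ID
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# One row from the response above — the comment shown in the detail view.
raw = ('[{"id":"ytc_Ugyn9ykPlnRpeFNc3Zp4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_raw_response(raw)
print(coded["ytc_Ugyn9ykPlnRpeFNc3Zp4AaABAg"]["emotion"])  # → fear
```

Indexing by ID is what makes the "look up by comment ID" view cheap: once the array is turned into a dict, rendering any comment's dimensions is a single key lookup.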