Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_UgyWt747b…` It's crazy how the hype around ai art is basically dead now. That was fast.…
- `ytr_UgwQTgGZr…` @SmellsFishyGame which is very scary. Agents out there making and doing things …
- `ytr_Ugxa3B90_…` @Gunkerjunk AI doesn't steal art, AI learns to create the same way a human does…
- `ytr_UgyeRyiJP…` @SorenSoren-ft8sk I think you and Saagar both are still wrong because your e…
- `ytc_Ugxt374a4…` Above and beyond this suit and its outcome, if Autopilot requires this much effo…
- `ytc_Ugz5OAH1s…` "Musk has no moral compass" / > Does Sam (Altman) has a moral compass? / "I don't kn…
- `ytc_UgySDbIJD…` The whole idea of running enterprises is to make money, but the irony is that au…
- `ytc_UgxP6-BFL…` Wasn't this the plot of Wargames? Just have the AI play tic-tac-toe. It'll rea…
Comment
I agree with the core optimism. My “office job” today would look like leisure to my great-grandparents. We already live better than kings did a couple of centuries ago.
What worries me isn’t the speed of change; it’s the trend in wealth distribution. If AI concentrates wealth and power even faster, inequality accelerates, and history shows where that leads: social unrest, wars, destruction. That “doom-and-gloom” path feels more realistic than people admit, and honestly it makes me sick to my stomach. The urgent question isn’t “will there be jobs?” but “what mechanisms ensure AI-driven gains are broadly shared so we don’t get there through catastrophe?”
youtube · AI Jobs · 2026-02-04T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzspxx4CKAae1u2Pvh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx4rS67Yg-QkizTTTd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxzWYvJc192h9xiU4R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyhWnW2BDOnWydqIZh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwh2zo_TfRPVw6wGVJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwDJm549ITCqc5YObh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxCozewSyXjiwZ7D8l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxMYsH9N0RkDD091jJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwtz-42LB3j9tFAVZh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxYA198pOunA7Z_Yix4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
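A batch like the one above can be checked before it is written to the coding table. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the sample responses shown on this page (the real pipeline's full codebook may include more), and the `ytc_`/`ytr_` ID prefixes are likewise assumptions based on the IDs visible here.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# Assumption: the actual codebook may define additional values.
SCHEMA = {
    "responsibility": {"company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "resignation", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in the samples start with "ytc_" (top-level) or "ytr_" (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present and hold an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_Ugzspxx4CKAae1u2Pvh4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_batch(raw)))  # → 1
```

Records that fail validation would then be queued for a re-coding pass rather than silently dropped, so the coded table stays consistent with the schema.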