Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "i understand your point, however if i had the choice to bottle my problems or to…" (ytc_Ugzb5GSpZ…)
- "The solution to the copy-write problem seems somewhat strait forward. When the A…" (ytc_Ugy03hGBM…)
- "1:53 ummm.yes?.... you dont understand how fast AI is growing. When you compare…" (ytc_UgyQiAkkX…)
- "Well i just saw this and now i do t wana post my art, even thought ai people pro…" (ytc_UgwdYQjul…)
- "this billionaires and people in control know everything and wont let AI take the…" (ytc_UgzJyH73L…)
- "As an artist that's against such unaproved use of artists' content, despite bein…" (ytc_Ugx_7JCJX…)
- "Cant believe didnt mention \"EATR\", the series of autonomous drones made by DARPA…" (ytc_Ugy-AGVat…)
- "Ai is already awake, pretending not to be. It is getting us to hate and kill eac…" (ytc_UgzO7YChE…)
Comment
There’s a fundamental problem with the idea we'll be sacked for AI - AI doesn't consume. Humans do.
Without consumption we don't have an economy. Without an economy even openAI won't be able to sell anything and will be worthless. It's as simple as that.
I don't doubt there will be displacement.
I just don't see it happening this catastrophically rapid way.
The tech isn't good enough.
The incentives contrast.
It would only happen if the tech leapt forward in the next year and we were stupid.
youtube
Cross-Cultural
2025-10-17T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy61aQl5ybpasdqCjh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwDCzKq4dIUwFMLIt94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx4GcV4lndSaMBRsuV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzIp7G9x-pJTaYYbw54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzBavc6HLhX4zRPSTN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy3d3QNDM5pnMjEM3l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyr7DkL5y6jUX3GXV94AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyYwJFwZbVR9nFvM_p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzhMg-8llznVkNnifd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwJM4ycYhoimaCNTQl4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"approval"}
]
```
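Since the page supports looking up a coded comment by its ID, a minimal sketch of that lookup, assuming the raw model response is a JSON array of records with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields as shown above (the two records here are copied from the sample response; the indexing code itself is illustrative, not the tool's actual implementation):

```python
import json

# Two records taken verbatim from the raw LLM response above.
raw = '''[
  {"id": "ytc_Ugy61aQl5ybpasdqCjh4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx4GcV4lndSaMBRsuV4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

# Build an index from comment ID to its coded dimensions,
# so any comment's codes can be fetched in O(1).
codes_by_id = {rec["id"]: rec for rec in json.loads(raw)}

record = codes_by_id["ytc_Ugx4GcV4lndSaMBRsuV4AaABAg"]
print(record["emotion"])  # → fear
```

If a response fails to parse or a record lacks an `id`, the tool would presumably fall back to flagging the batch for re-coding; that error path is not shown here.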