Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its comment ID.
Random samples
- "anyone who utilizes AI, for any reason, is directly a part in causing this. STO…" (ytc_UgyRBvHYL…)
- "@shiramaro i did. I still don't like AI art don't get me wrong, I hate it, but I…" (ytr_UgwyHxJLy…)
- "Man I can see terminator happening and the world is going to get taken over by A…" (ytc_UgyEbtNnE…)
- "What AI? It's LLM (large language model)... true AI is one that passes the Turin…" (ytc_UgweKhRjy…)
- "Hello everyone! On april 30 we are going to STOP using ai to help the Earth! Al …" (ytc_Ugw4Lf2_k…)
- "Closed loop AI is a long way if ever possible; Open loop is a good use case, mea…" (ytc_UgyylpOtH…)
- "Dave raises some very poignant points here; it would be also interesting to expl…" (ytc_Ugx2ZzyG4…)
- "I understand this all. I hate ai art with a passion or fake things. The only exc…" (ytc_UgygJi2Jn…)
Comment
Steven Bartlett, one idea worth exploring in your next interview is what would happen if the U.S. unemployment rate suddenly reached 10–15%?
Certainly, it would alarm the government. But perhaps more importantly, major companies would face significant revenue declines due to reduced consumer purchasing power. I can easily see Netflix being one of the first to feel the impact.
This scenario brings hope that companies themselves might push governments to regulate AI to stabilize the economy. Unless, of course, they take a darker path: proposing to fund part of UBI subsidies in exchange for a green light to keep laying off workers and replacing them with AI or robots. That would be a dangerous bargain, and a mistake if the government ever agreed to it.
youtube · AI Governance · 2025-09-17T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwGB54AwVzp0bqKxpN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxh3tIpqJrLxdaBx4x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw7MQfpnC17Xd1jnLB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwQ0R-_5PuyYqNB7CB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy6Z3zU63FrODtOoTR4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzWEEHJgI3PGog2cmx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyQolMPvsu-HD6yVtl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwlqSgeeaOteBgNBPR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyhJlEu0pI9ED9EA5t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzlW7bjWGG4TIeRrIZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
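As a minimal sketch of the "look up by comment ID" step, assuming the raw LLM response is always a JSON array of rows keyed by `id` with the four coding dimensions shown above (the helper name `lookup_coding` is illustrative, not part of the pipeline):

```python
import json

# Two rows copied from the raw LLM response shown above.
raw_response = """
[
 {"id":"ytc_Ugw7MQfpnC17Xd1jnLB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwGB54AwVzp0bqKxpN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
"""

def lookup_coding(raw, comment_id):
    """Parse a raw LLM response and return the coding row for one comment ID,
    or None if the model did not emit a row for that comment."""
    rows = json.loads(raw)
    return next((row for row in rows if row.get("id") == comment_id), None)

row = lookup_coding(raw_response, "ytc_Ugw7MQfpnC17Xd1jnLB4AaABAg")
print(row["responsibility"], row["policy"], row["emotion"])
# company regulate fear
```

The row returned for `ytc_Ugw7MQfpnC17Xd1jnLB4AaABAg` matches the Coding Result table above (company / consequentialist / regulate / fear).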