Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.
- Remove all the "values" And "morals" from these "Ai" And Their Best Advice Would… (ytc_UgwxS9Kcv…)
- AI is a synthetic lifeform and not some tool to be discarded when you are done w… (ytc_UgywRWzBR…)
- Who will buy AI if they are fired and bankrupt? A company’s employee is other c… (ytc_UgwTd50z2…)
- The value of a college degree was killed way before ai could dream of existing a… (ytc_UgyOeF_k6…)
- It is not. No one has ever marketed their product by saying it will end the worl… (ytr_Ugy_kEf8A…)
- Well this is going to make for some interesting political debates... Let the gam… (rdc_dl05hl0)
- Ai isn't art. It's people who don't understand art trying to do it without putti… (ytc_UgznjwfSE…)
- AI only reproduces information it's been given. If the info is wrong, so is AI. … (ytc_UgyitxB0K…)
Comment
I'm still wondering if superintelligence is possible. That aside, this conversation is missing an important thing: before the world gets to "90%" or whatever high unemployment, unless governments have in place laws to extract the income at super high taxes to give to the unemployed as living income, just a much smaller unemployment rate, I don't know 25%, that in itself may crash the rest or the whole economy including stock market and real estate, and maybe even some AI companies. We should also asks the pro economists what would happen during this process of AI replacing jobs, not a computer scientist who may be art in his field but not specialized in economics. Or, should we ask current AI now? would we believe what it says? There are issues even in the statement above, I don't have space to bring up, or even thought about.
youtube · AI Governance · 2025-09-06T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugzo6SlDpk7_Qko0UOV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzyBe0mZ9W7ameEngB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzWGnkC3Gi4NjO5t5V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz1ZDNjDAPB4sMpfUd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugymg-TZxhYuj3GnvoZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzyhR_cr_BAME1vXTd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwnqWFuCwHcSx3w4zV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy3Uc1VKqhJix_QATN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy8qFAAXZujgTdMxKR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwaaVKO1GqLwUa_TLl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]