Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:
- "The problem with materialist scientists is they lack the concept of the spiritua…" (ytc_UgzJPoQs0…)
- "The fact that chatGPT understood the Jordan Peterson reference made me laugh out…" (ytc_Ugxo5iuDh…)
- "Humans are the existential threat to fellow humans. We don't need AI to destroy …" (ytc_UgxJZ1QIv…)
- "AI as soon as there is something that classifies as such and we admit it exists …" (ytr_UgxwuKyhs…)
- "This is a silly claim, because if AI takes every job, who is going to pay for wh…" (ytc_UgyDCjlor…)
- "The context limitation is the one I've experienced to the greatest length with G…" (ytc_UgyvSacdS…)
- "If we want to do this properly, we put dumb ai in a body and teach them human ri…" (ytc_UgyORz9bu…)
- "You see shit like shit and you still 100% know some genius in the military will …" (ytc_UgxV8DvKS…)
Comment
Every time there’s a new technological breakthrough, it’s sold as something that will make life easier and reduce the need for people. But historically, it’s usually the opposite — the Industrial Revolution, computers, and now AI have all created new kinds of work, often making life more complex. Instead of replacing people, technology tends to reshape what we do and raise expectations, which often means we end up working more, not less.
youtube · AI Governance · 2025-09-05T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx11H04EQFozpn61Xx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyZ5Nzev-JzVuv2NVR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwGzGO-wgwHM5JK8sx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy_E8SedI6zyQzVVfB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQMCt_tqjnMhzBkTx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzgId4lO3W0oJRND6B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz-90VGNMwGvlFAzoR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyiBCZVVb2zmbAbnt54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz3uAVse9eYaFShn254AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxOIjmweqVrwVagOCF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
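A raw response like the one above can be checked before its codes are written back to the dataset. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the codes visible in this response (the real codebook may define more categories, e.g. additional `policy` values), and the function name `validate_batch` is an assumption, not part of the tool.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# on this page; the actual codebook may permit additional categories.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none"},  # only value observed here
    "emotion": {"approval", "fear", "mixed", "outrage", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    keep only rows whose value for every dimension is in SCHEMA."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# Example: one well-formed row passes, an out-of-schema row is dropped.
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}]'
print(len(validate_batch(raw)))  # 1
```

Rows that fail validation would be queued for re-coding rather than silently dropped in a production pipeline; returning only the valid rows keeps this sketch short.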