Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a coding directly by comment ID.
Random samples

- "(Before this message is deleted a second time, it would be good to understand th…" (ID: ytc_UgxhWfSM8…)
- "@philardo message sends by human. Human operates AI. He could send his message …" (ID: ytr_UgwlKdf78…)
- "My favorite part of this video is the ad breaks. ChatGPT and Grammarly allllllll…" (ID: ytc_UgwC7d6iE…)
- "All these ai art tools have been a godsend for me, it has let me make fun profil…" (ID: ytc_UgxfIW7Vw…)
- "I mean yeah, they are. When you have some of the top GOP opposing Ukraine fundin…" (ID: rdc_jxz7qeu)
- "None of which amounts to anything if there isn't a method by which learning can …" (ID: ytc_UgysVC0Ie…)
- "We have no idea how to make true AI. It’d be like engineering a human brain, we …" (ID: ytr_UgxjoKqrj…)
- "The way we train AI on the open web is almost like letting a child be raised in …" (ID: ytc_Ugwm7BCar…)
Comment

> Even Margret Atwood didn’t see this coming. However, there sectors when even the most powerful AI will rely on humans for construction and maintenance of the resources it requires to exist. Power generation and infrastructure to deliver power where it’s needed, for example. It’s need to navigate the world physically may provide some restraint, until it’s able to either satisfy its needs without human labor, or farm humans (or some version of human) to do it. In any case, even if it goes off the rails, it won’t exterminate the human race until it can change its own diapers.

Source: youtube · Topic: AI Governance · Posted: 2025-07-10T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id": "ytc_UgwcIwtcQzHsXcAhojl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzV63t2v6UmrKohA7N4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx-qxLmq6ptsItyEQ14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyJNadSymyDZdJYeCB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugya7LI6ovUQh2tfw6l4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzXbj7dPvncnjm10lJ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugxp2thsfrIBLcHaJwt4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzrw-c6I56S25PaFwJ4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwJqF-Ekrak0NvLXIB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwpkpi2LT3NNsfX2mN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
```
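The look-up-by-comment-ID view above can be sketched programmatically: the raw response is a JSON array of per-comment codings, so indexing it by `id` gives constant-time lookup. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown above; the function and variable names here are illustrative, not the tool's actual API.

```python
import json

# A shortened sample of a raw LLM response, in the same shape as the
# one displayed above (a JSON array of per-comment codings).
raw_response = """[
  {"id": "ytc_UgwcIwtcQzHsXcAhojl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzV63t2v6UmrKohA7N4AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM response and index each coding by its comment ID."""
    codings = json.loads(response_text)
    return {coding["id"]: coding for coding in codings}

# Usage: fetch the coding for one comment by its ID.
lookup = index_by_comment_id(raw_response)
coding = lookup["ytc_UgwcIwtcQzHsXcAhojl4AaABAg"]
print(coding["reasoning"])  # consequentialist
```

Because comment IDs are unique per platform (`ytc_`/`ytr_` for YouTube, `rdc_` for Reddit), a flat dictionary keyed on `id` is sufficient; no per-platform namespacing is needed.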