Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click one to inspect):

- "In reality, they have shown AI is zillions of times faster than humans in findin…" (ytr_Ugw8V-ptH…)
- "4:07 eventually they won’t need her, because they gonna assigns a agent to do he…" (ytc_UgyeJfh1M…)
- "It's scary because I don't know if Musk, Bezos, Gates and Mark even have a moral…" (ytc_Ugzb9fxGw…)
- "In no way am I trying to be "smarter" than this guy, but... We've been hearing t…" (ytc_Ugx4W-2hM…)
- "They are using your stuff to make something to sell. That's copyright for me. An…" (ytc_UgxjY-t0s…)
- "Who buys cat food and dog toys? That is right, your robot overload will buy your…" (ytc_UgzaHNS_6…)
- "I think writers need to embrace the truth: LLMs aren’t going anywhere. Even if y…" (ytc_UgwMSDozh…)
- "When AI hallucinates, that humans want to terminate it, self preservation might …" (ytc_UgxeaF2Jk…)
Comment

> So what happens if you ask AI how it saves the world?
> We are the parents here. If all we asked of our children was to come up with new and more brilliant scenarios of how they were going to kill us I'm sure they would come up with an endless and brilliant list of ways to get the job done. Maybe we should ask better questions?

Source: youtube · AI Governance · 2023-07-11T05:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwV5AA3fxUKF4oSDth4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzo2dczRj0tbhjoNcF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyZwA-gUR45qDJsXhZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwfRgMLqnuoLPJcytd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyrJE6cbDzaNHsvZqh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwCH0sUo3CbilbcCLl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzfqZs7zA4lbfSYg7B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwhZcmOb4D7UWOo-eJ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzqGWcclf8jS0As0qV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxRhxReXSRD1bjYhTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
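The raw response above is a JSON array of one coding per comment, each with an `id` plus four dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and sanity-checked is below. The allowed value sets are inferred from this one sample only, and `parse_coding` is a hypothetical helper, not part of the actual pipeline:

```python
import json

# Dimension vocabularies inferred from the sample response above.
# Hypothetical: the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none"},
    "emotion": {"fear", "outrage", "approval", "mixed",
                "indifference", "resignation"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed rows.

    A row is kept if its id carries a known prefix (ytc_ for comments,
    ytr_ for replies, as seen in the samples) and every dimension value
    is in the inferred vocabulary.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in vocab for dim, vocab in ALLOWED.items()):
            valid.append(row)
    return valid
```

For example, a row whose `responsibility` falls outside the inferred vocabulary would be dropped rather than stored, which surfaces schema drift in the model output early.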