Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Always making it political... do you realize that using AI to COMBAT HUMANS was …" (ytr_UgxNnWzdn…)
- "I don’t understand if this is so bad, why no one tries to stop it. I am blessed …" (ytc_UgyKmZ7lx…)
- "Yeah Gary is just a tad bit invested in AI being huge, isn't he? LOL ... I don't…" (ytc_UgwHzGAdg…)
- "There are many non-consenting deep fakes of public figures. Shame. On Youtube fo…" (ytc_UgzfTOKX_…)
- "OK, I kind of knew this is where AI is going, and I think in the future of AI, e…" (ytc_Ugx93jytf…)
- "If there is UBI it should first go to those displaced by automation. Along wit…" (ytc_Ugznql3tA…)
- "Technology changes. People don’t. We will always have to deal with people tryin…" (ytc_UgyNh4CfY…)
- "I think Scott was exactly right when he said basically nobody is going to start …" (ytc_UgzgNakNI…)
Comment
I remember an old movie. All Prisoners should wear a neckband contains explosives. The warden can monitor their activities. If any prisoner behaves badly or try to attack others warden can simply press a button, the prisoner will get killed. This video shows the similar idea. We should not glorify these kind of innovations with an AI tag.
Source: youtube | Topic: AI Governance | Posted: 2019-10-09T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyTnqsd7q-hOODhadB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwF2ZVkYGHSk4n8TiN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzA032hpqIhX7L17GZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzRuVGZ9ZUC6opkpEl4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyPogWzqQBmGr2UQHJ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugxh-h-7OxyTBh0w4Th4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzeEs-1lYDVCj3LQXR4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxRud009y2yK_RQkWJ4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzN98kKiCNW9ym46b94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwwZOm1LWmK4jm_VlZ4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]
```
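A raw batch response like the one above can be parsed and indexed by comment ID before it is stored. The sketch below is a minimal example, assuming the dimension vocabularies inferred from this sample (the real codebook may define more categories, and `parse_batch` is a hypothetical helper, not part of the tool shown here):

```python
import json

# Allowed values per coding dimension, inferred from the sample
# response above (assumption: the real codebook may have more).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response, validate each record's
    dimension values, and index records by comment ID."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Hypothetical record for illustration.
raw = ('[{"id":"ytc_abc","responsibility":"developer",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
coded = parse_batch(raw)
print(coded["ytc_abc"]["policy"])  # ban
```

Indexing by ID makes the "look up by comment ID" inspection above a constant-time dictionary access, and rejecting out-of-vocabulary values catches malformed model output before it reaches the coded dataset.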