Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- It’s because of people like this guy is the reason AI will eventually turn on us… (ytc_UgzBNfanW…)
- For me, art was never about the final product, but the process of getting there,… (ytc_Ugw2zncUZ…)
- I have a feeling AI will be like health warnings on cigarette packets. Most peop… (ytc_UgwZnuNkw…)
- I believe AI is the image referred to in that Bible that will be created by the … (ytc_UgxRRuR2z…)
- There is no need for wait for AI to kill us. The current American and European L… (ytc_Ugz6LbOJK…)
- Focused on black and white division instead of the robot taking your job ... Nic… (ytc_UgxKX77Rt…)
- This scenario touches on an interesting aspect of how language models like ChatG… (ytc_UgwONEPvG…)
- If you don't think the government already has super intelligence haha! Google sa… (ytc_UgxXrmDpY…)
Comment
Everyone seems to agree that AI has the potential to destroy humanity BUT are willing to roll the dice and take the gamble and decide to move forward in the face of potential catastrophe. What does THAT say about humanity in it’s current form when .00000000000001% of the population are making world / human altering decisions on behalf of the majority of the population who are either to blinded, ignorant or just don’t care what happens to the world?
youtube · AI Governance · 2023-06-16T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
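One way to sanity-check a coding result like the table above is to validate each dimension against a set of allowed values. The sets below are a hypothetical sketch inferred only from the codes visible on this page; the actual codebook may define additional categories.

```python
# Hypothetical allowed-value sets, inferred from the codes visible on this
# page; the real codebook may include more categories.
ALLOWED = {
    "responsibility": {"distributed", "user", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"fear", "approval", "indifference", "outrage", "mixed"},
}

def validate(row: dict) -> list:
    """Return the dimension names whose value falls outside ALLOWED."""
    return [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]

# The coding result shown in the table above passes validation.
row = {"responsibility": "distributed", "reasoning": "deontological",
       "policy": "regulate", "emotion": "outrage"}
print(validate(row))  # → []
```

Running `validate` over every record in a raw response would flag any code the model invented outside the known categories.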
Raw LLM Response
[
{"id":"ytc_UgxsclTO-2tjUOevEQR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxFzdGc2f8_WDX6u5J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyZr1N6L2gQVNc0WnN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzG-syn2W7hOzlc0754AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxQEuTdQIOBshMYENx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw16FxSRSvvX8rfLKJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzUd3kx_2sUOzdMTS54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwvsbJjzNbPqjyPAc54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyZsz-YdxEMwQfL6gx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzf879sMfBZdXcdOYt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}
]
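The "look up by comment ID" workflow above amounts to parsing the model's JSON array and indexing each record by its `id` field. A minimal sketch (the two-entry `raw_response` string is abbreviated from the full response shown above):

```python
import json

# Abbreviated copy of the raw LLM response shown above.
raw_response = """[
{"id":"ytc_UgzG-syn2W7hOzlc0754AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxQEuTdQIOBshMYENx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]"""

# Parse the JSON array and index each record by its comment ID.
records = {row["id"]: row for row in json.loads(raw_response)}

# Looking up a comment ID returns its coded dimensions.
coded = records["ytc_UgzG-syn2W7hOzlc0754AaABAg"]
print(coded["policy"], coded["emotion"])  # → regulate outrage
```

Indexing by ID this way is what lets the inspector map a code back to the exact model output that produced it.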