Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- @JustanotherANOTHERgamer'Real human' is just a biological machine. Much more co… (ytr_Ugx8Ac7ZR…)
- I WAS JUST TAKING A BREAK FROM CHARACTER AI AND THE FIRST VIDEO I SAW WAS THIS 😭… (ytc_UgwN9W8rJ…)
- AI generated images should never be sold, there should be no rights for those im… (ytc_UgwuWmRMA…)
- I don't actually worry about true AGI. A being that is exponentially more intell… (ytc_UgxInO7aI…)
- If an AI is built to achieve Super Intelligence, won't it try to control Human B… (ytc_UgxtIzUO8…)
- “Saying ‘AI only works by theft’ ignores licensed corpora and synthetic pipeline… (ytc_Ugx7YtRdP…)
- So in order for us to survive we need the governments to not only listen to the … (ytc_Ugz6UVHao…)
- AI can manufacture the products deliver the products and buy the products for Mr… (ytc_UgxC6AShE…)
Comment
I do not think that it will be possible to stop what happens in the Future. We could have regulations in the UK, America and the EU, for example, but in other parts of the world, they may not follow the same guidelines. If AI took all of our jobs because of the greed of individuals using AI robots for business and the greed of people selling the robots for driving cars, lorries, aeroplanes, delivery drones or bots, amongst many more jobs, who would pay for the person who has no job? If AI is aimed for use by people, but the people cannot find a job, then who pays for the technology?
youtube
AI Governance
2025-06-19T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyOcOrRWPRTaamoCfV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz_6BF-qrF0d3wK-Xd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzxEv_dkZdKmKyNQdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxonceApChfecRu5Sx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz-V0oVp9gOBQh-P3h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxHDx1FQ34JIbG_o3l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzn9GQJVKYRowjdIwF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx8EDh61b0lrGVreKF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyexkVWruXB2a5eo6h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxrmB9BMl4FlZiW_Md4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"approval"}
]
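Looking up a coding by comment ID amounts to parsing the raw response (a JSON array of per-comment objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys) and indexing it by `id`. A minimal sketch, using two entries from the response above; the helper name `index_codings` is hypothetical, not part of the tool:

```python
import json

# Two sample entries copied from the raw LLM response shown above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgxHDx1FQ34JIbG_o3l4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyOcOrRWPRTaamoCfV4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "approval"}
]
"""

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw response and index the coding objects by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(RAW_RESPONSE)
coding = codings["ytc_UgxHDx1FQ34JIbG_o3l4AaABAg"]
print(coding["policy"])   # -> regulate
print(coding["emotion"])  # -> fear
```

Each dimension in the Coding Result table corresponds directly to one key of the matching JSON object.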