Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment

> Is anyone else scared to click the Gemini icon for help in getting summary of video? They have created something they have no clue what is to come except probability of risks. Just like all the Ai movies that were made and look how they all turned out. This doesn’t make any sense how governments would allow this to continue without safety regulations knowing there have been to Ai execs speaking out about it now. Something is not right

Source: youtube · Topic: AI Governance · Posted: 2026-02-17T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzBPTEfirV4HWzyfnd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx454tiBb8Jd2XoWbd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwZzBX80YZJcGswMxt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxgluA8GSvijojRNU54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxsiWl6yveLGBJHZBl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhmwX330zMsai4R7h4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzvhDNjmF16OZxcQSd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy4Kap1TExeG2fBlAx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxfUDH0zQ4feaMxwEB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxV2bn20-0pngxW9SV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
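A batch response in this shape can be parsed and indexed by comment ID for lookup. The sketch below is a minimal, hypothetical example: the four dimensions match the coding table above, but the allowed-value sets are inferred only from the values visible in this sample and the real codebook may include additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# ASSUMPTION: the real codebook may define more categories than these.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "government", "unclear"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"unclear", "regulate", "industry_self", "none"},
    "emotion": {"outrage", "mixed", "indifference", "approval", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the inferred schema above.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        # Keep only the coded dimensions, dropping any extra keys.
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Indexing by ID mirrors the "Look up by comment ID" feature above: once parsed, `coded["ytc_…"]` returns that comment's four dimension values, and malformed model output fails loudly instead of being stored.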