Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples
Army should wake up, because even them will be replace when corporate like X hav…
ytc_UgzReGOq3…
GUYS PLS DONT USE CHARACTER AI ITS HORRIBLE FOR YOUR MENTAL HEALTH AND YOUR DIGN…
ytc_UgxSCcBXQ…
Oh sure, automate everything. Lots of wealth to go around. So we can do communi…
ytc_Ugy6hYTZ6…
Do people not get what art is? It represents what makes us human. AI is currentl…
ytc_UgyneHu7i…
Idk why some of you even call it Art. I always called those AI images or videos.…
ytc_Ugzs7aVrU…
I wonder if all our Tech "Upper Management" are watching these videos and then f…
ytc_UgwEOu5Nl…
If they replace us with AI there will be no more consumers to purchase products …
ytc_UgxvIUvJD…
Ai will never be creative and will always require a creative human to use it wel…
ytc_UgyhYbFSx…
Comment
I love how everybody thinks Elon Tusk is so smart but if you really listen to him, you understand that he only has trivial knowledge, surface knowledge about deeper topics. The problem is that the general population has even less than surface knowledge of deep topics and thinks he's the smartest dude.
Look again at the second question he's given "what are the exact dangers of AI?" He's blabbering on about "dangers to society" but can't even but the finger on it. Instead he laughs about "it sounds like terminator" but you can see that's exactly the extend of his knowledge there. "something like terminator"
Then, at the end, tucker asks the question again because tusk hasn't answered it at all. And all he comes up with is the dangers to social media? An AI writing posts on twitter?
LOL this guy is a rich sham just like so many others. Like trump almost
youtube · AI Governance · 2023-04-18T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzyCAGAEUyMoPg1YUV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz90xk3jw_FeSEGoYx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwAgfY-AWMJzrZ3RW94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyh8jUcUxQdvA7JoCN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzRU9wbEjA2os2s02t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwjHxfNS8zEvAwYuDV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyMu9BAFLyDUm_Zutd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzRtC--xntBBEwAUDV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw0EIttSl3g3zlsxud4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw8ccmyjDoMcAOMQxp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
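A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator; the allowed values per dimension are inferred only from the rows shown here (the real codebook may contain more categories), and the `validate_batch` helper is hypothetical, not part of any existing pipeline.

```python
import json

# Allowed codes per dimension, inferred from the sample rows above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose codes are valid."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every dimension must carry a known code, and the id must look
        # like a YouTube comment id (all ids above share the ytc_ prefix).
        codes_ok = all(row.get(dim) in codes for dim, codes in ALLOWED.items())
        if codes_ok and row.get("id", "").startswith("ytc_"):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_UgzyCAGAEUyMoPg1YUV4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1
```

Rows that fail validation can then be queued for re-prompting or manual review instead of silently entering the coded dataset.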