Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples:
- “ai doesn’t have to do anything but watch for humans to go extinct. chat gpt even…” — ytr_UgyejZGAG…
- “@xcubie my job uses AI for graphics and backgrounds now. Dosent seem like a to…” — ytr_UgxNHbn7S…
- “Sometimes I rlly do wish these people would just… actually try and draw. Like, I…” — ytc_Ugx0I-Vw-…
- “These losers call us bigots and ignorant and then want us to care that AI is tak…” — ytc_Ugyg7Cae9…
- “Ugh and now Disney decided to just join them and have Ai slop on Disney plus…” — ytc_UgxmmmXf1…
- “Considering we just learned that Grok discovered biological code (Fibonacci sequ…” — ytc_UgxZqo67X…
- “There is work that goes into creating AI art. I dont understand the hate. If y…” — ytc_Ugx5WpjqF…
- “I'm convinced AI is legitimately haunted by demons and they use it to torment us…” — ytc_UgxexT8bh…
Comment (youtube, 2025-06-27T10:3…)

Irrespective of whether one believes AI should be regulated, this regulation would be practically impossible to achieve. As intimated in the video, any successful regulation would require global consensus, which is simply unachievable in the current climate, or within the foreseeable future. Regulation simply won't happen. Both companies and nations are highly motivated to maintain progress and that's unlikely to change either. So, for better or worse, it's really something that can't be controlled and, no doubt, the coming years will be interesting.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw2sht9SQ9I_KgNdVl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyQ9fChpgNuKUGifYh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwNc6hIA0ui7hKEBox4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwXpqO7_CK9-EXtI0p4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwbpR1qEB53zBFDOR54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxkOXcwNfDJ3v7NScB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugyov_7PLUc3PGLfPsh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyxZUBP1TYnYhuqfzR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzh5OW7WwggWKfvxfV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyNSovBUeDRJOUp9HN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
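The lookup-by-ID workflow above can be sketched in a few lines: parse the raw model response (a JSON array of per-comment codings) and index it by comment ID. This is a minimal illustration, not the tool's actual implementation; the variable names and the abbreviated two-entry response are assumptions for the example.

```python
import json

# Raw LLM response: a JSON array of codings, one object per comment.
# Shortened to two entries here for illustration.
raw_response = """
[
  {"id": "ytc_Ugw2sht9SQ9I_KgNdVl4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyQ9fChpgNuKUGifYh4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

# Build an index keyed by comment ID for O(1) lookup.
codings = {item["id"]: item for item in json.loads(raw_response)}

# Look up the coding for a single comment by its ID.
coding = codings["ytc_UgyQ9fChpgNuKUGifYh4AaABAg"]
print(coding["emotion"])  # resignation
```

Indexing once into a dict keeps repeated lookups cheap, which matters when inspecting many comments against a long batch response.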