Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgzSZ30Ab…: I have so much pride in these female journalists I've been watching lately. The…
- ytc_UgxjxgQEE…: It's funny that the ai defenders had so many typos in their comments. Maybe beca…
- ytc_Ugz4WiE_6…: If they can't make AI indistinguishable from real videos, they'll make real vide…
- rdc_nt8pkx2: Army and Air Force both already had unclassified AI services, and anyone in the …
- ytc_UgzWcEazC…: 👀If humans can't tell that it is a robot, they are a lot more brain damaged than…
- ytc_UgyUSvL74…: "Hot" ? F'N weirdo titled this Video! There is *NO NEED* to do all of this *JUST…
- ytc_Ugw3z7WV2…: Alarm of apocalyptic prophets. They seem to be toying with Ameca the robot, they…
- ytc_UgxnM-1Ms…: I think, way back when it started, it was an interesting thing. Crai yon. I pla…
Comment
> I disagree. I think he is overestimating the speed at which technology can reach superintelligence (assuming AI can even gain agency or free will at a level that could cause "mutually assured destruction") and underestimating the stupidity of mankind and how much we inherently prevent progress. There is far too much greed, lust, power mongering, and testosterone in this world for society to ever put this genie back into the bottle.
>
> The analogy of the dog is problematic. We're assuming a hierarchy of species where humans are far superior to the canine. However, superiority depends on what you're measuring. Sure, I can do algebra, but the ability my dog has to smell and track the scent of a kidnapped child. That's a much better party trick...
Platform: youtube · Topic: AI Governance · Posted: 2025-09-04T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwo8mdO5NmUewBKJe54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzAxs4ex4QhQ93IUux4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwwMZz0ZEysZgXab1x4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwXiOKovFRlfqDD9u94AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx04xxwxVSCw575cDV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwLS2mD7s515J36TGd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwHNlf6qQ-a-Yx0veF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxGldJnPVjHJN5kqdp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxCynoDO1rkGorGogN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyG6SB8c3gLqxvH85V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
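Because the raw response is a JSON array of records keyed by comment ID, it can be indexed for the "look up by comment ID" view and sanity-checked against the coding schema. A minimal sketch, assuming allowed label sets inferred only from the values visible on this page (the full codebook may define more categories):

```python
import json

# Allowed labels per dimension, inferred from this page's data (assumption,
# not the project's actual codebook).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed"},
}

# Two records taken verbatim from the raw response above.
raw = """[
  {"id":"ytc_UgwwMZz0ZEysZgXab1x4AaABAg","responsibility":"distributed",
   "reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyG6SB8c3gLqxvH85V4AaABAg","responsibility":"government",
   "reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]"""

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # index for lookup by comment ID

# Reject any record whose labels fall outside the allowed sets.
for r in records:
    for dim, allowed in ALLOWED.items():
        if r[dim] not in allowed:
            raise ValueError(f"{r['id']}: unexpected {dim} label {r[dim]!r}")

print(by_id["ytc_UgwwMZz0ZEysZgXab1x4AaABAg"]["responsibility"])  # distributed
```

The same validation pass would catch a malformed model response (e.g. a misspelled or invented label) before it reaches the coding-result table.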