Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse a random sample.
Random samples — click to inspect

- "True ai doesn’t exist. It is a program designed to do things depending on inputs…" (ytc_Ugyso0sym…)
- "Still waiting to hear about costs. Yes, AI doing any particular job will be che…" (ytc_UgzzYeRX0…)
- ""People should think through how is AI going to reshape our lives." People said…" (ytc_UgyMC-DeY…)
- "He says in 20 years, robots will walk among us indistinguishable from humans. Wo…" (ytc_UgzzsD1_v…)
- "Why can't they teach kids ai and programming at the start of kindergarten? Are t…" (ytc_Ugy52UjBJ…)
- "I don't understand how the headline is connected to the research. Herd immunity …" (rdc_g9swydo)
- "lol no, they have all the funds to buy weapons. Economy collapse is the worst ou…" (rdc_kif8zoe)
- "2:35 that’s ridiculous Eldon ring is distinct from dark souls or any other form …" (ytc_UgxrvZ_9v…)
Comment
Yudkowsky is wrong...about everything. He is a blogger who staked his claim on the farthest end of pseudo-reality; extremism of narrative is his "golden goose"...it attracts antisocial fans in a cult-like atmosphere. He has no patents, not a coder, not a true researcher. He's arrogant and bombastic. His treatment of Stephen Wolfram was a targeted takedown...to show everyone how smart he is. In reality? He's not even in the same universe as Wolfram. He was disrespectful and condescending and has no substance at all. Regarding Yudkowsky's predictions, he takes a very narrow view which looks like it was derived from science fiction; your Colossus-Guardian's or Sky-Net's of the world. He thinks that AI's are either Machiavellian or dismissive/indifferent to humanity's existence. Yet, he thrives in his self promotion. So have many cult leaders.
Source: youtube · Topic: AI Governance · Posted: 2025-12-28T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz8irGap780pgaPYUN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx0mHqIi_C7JqlUV2R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxHBeFUOmugoVyknvx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxM-aBghc33RgKAPnN4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwz9lk7g386EI-9eyx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzGtAPrGoIn6oXFrhR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxvHpytTHJES6iUFkV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy0kMTg4ZGEN4s39vN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzeMMSDkMeB2lv2WCp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzZB0XUu6-_tfaFNvV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
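The lookup-by-ID flow above can be sketched in a few lines of Python: parse the raw batch response as JSON and scan for the entry whose `id` matches. This is a hypothetical illustration, not the tool's actual code; the helper name `lookup_coding` is invented, but the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the response shown above.

```python
import json

# A trimmed copy of the raw LLM response: a JSON array of per-comment codings.
raw_response = """
[
  {"id": "ytc_Ugz8irGap780pgaPYUN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxHBeFUOmugoVyknvx4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one comment ID,
    or None if the model's response does not contain that ID."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgxHBeFUOmugoVyknvx4AaABAg")
print(coding["emotion"])  # -> indifference
```

A real inspector would also need to handle responses that fail `json.loads` (malformed model output), which is one reason to store the raw text alongside the parsed codings.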