Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples

- `ytc_UgyM-vB_S…`: "So you do these tests to see how dangerous AI might be, they seemingly come out …"
- `rdc_oh2fi6u`: "So the AI model steals content to train the model, but the company that is also …"
- `ytc_Ugzc_HakO…`: "The ideal business has only the owner and AI. Press the AI auto button and go on…"
- `ytc_UgzgcVg0a…`: "The closest you can compare AI art creation to is giving an actual artist very d…"
- `ytr_Ugxkc8jJb…`: "@event__horizon yeah in 2026 its already gonna be voted on. I will bet on it. A…"
- `ytc_UgzMvyRGU…`: "The benefits for AI are going to be for corporations and government control, but…"
- `ytc_UgiO2aYAq…`: "So program them not to feel pain. Or maybe ask if we give a fuck because its a f…"
- `ytc_UgymmKBCp…`: "Do you think the AIs are selfish, underhanded, and sell people out because they …"
Comment
Neil— I’ve admired your work for years and respect you deeply. That’s why I’m making this comment. Your moderation of the Asimov panel discussion was, by any fair standard, a failure. You were arrogant, visibly biased against AI, and you did not take the subject seriously. The repeated bursts of laughter were inappropriate and dismissive. You were there to moderate, not to participate as an advocate.
I have no objection to a rigorous discussion of AI risks. That conversation is necessary. But your approach that night was abhorrent, and it undermined the credibility of every concern raised.
I have never seen you fail so badly as a communicator.
You are better than that performance. I hope you recognize what happened and find a way to correct course.
- Source: youtube
- Topic: AI Governance
- Posted: 2026-04-09T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyITnnQi63Jc1r4u914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxFtoJszr3v3g44IpR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwuKy6QufX8TCo7_Ml4AaABAg","responsibility":"investor","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxfeTaqrBLqGNaIWnV4AaABAg","responsibility":"moderator","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwZPomTG_RHLiPkXlx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx8SETfOMr-czgQlId4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzijvHPfgviqRnbT_F4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwmV79b2N6I3xGx0np4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwi_oUaa1CyKC2SdIV4AaABAg","responsibility":"system","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgynScBmISJsQaXR4014AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
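The "look up by comment ID" step above can be sketched as follows: the raw LLM response is a JSON array of per-comment codings, so parsing it and indexing the entries by `id` gives constant-time lookup. This is a minimal sketch in Python; the variable names are illustrative, not part of the tool.

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment.
# (Two entries from the batch above, shown for brevity.)
raw_response = """
[
  {"id":"ytc_UgxfeTaqrBLqGNaIWnV4AaABAg","responsibility":"moderator","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzijvHPfgviqRnbT_F4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
"""

# Index the codings by comment ID for O(1) lookup.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

# Look up the coding for a specific comment ID.
coding = codings["ytc_UgxfeTaqrBLqGNaIWnV4AaABAg"]
print(coding["reasoning"], coding["emotion"])  # deontological outrage
```

A dictionary keyed by `id` is enough here because comment IDs are unique within a batch; a missing ID surfaces as a `KeyError`, which is the natural signal that the model dropped a comment from its response.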