Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below.
Big corporations like Disney will throw tens of millions at buying politicians a…
ytr_Ugxddd6Al…
Ai is already scary especially that AI drone test where the drone attacked its o…
ytc_UgxEldRxn…
Right now I'm debugging a set of queue jobs that are triggered by other jobs tha…
rdc_jigg9w6
Just as this question to chatgpt “in growing population, with all the automation…
ytc_Ugydn0WHl…
I think I should get compensated for the amusing my personal data from the inter…
ytc_Ugy_TPbJ1…
DISCLAIMER TO ALL GOOGLE AI MODE USERS!!!!
As a user myself, & an expert in na…
ytc_UgyxH2YxH…
This is why I'm glad the education system is being targeted now.
Tired of the s…
ytc_Ugwi0wr3A…
another thing is censorship, too much censorship nowadays in US, with the banks.…
rdc_nikkrpo
Comment
Ezra has a point about humans being in continual negotiation with AI. But he's contemplating a widespread effort with the spectrum of mankind having representation at the table. Peter Thiel on the other hand doesn't seem to want to bother with most of the human race, and in fact has publicly referred to Yudkowski as "the antichrist." And these are the people who will be doing all the negotiating with AI.
youtube
AI Governance
2025-10-16T02:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxYZZUWf1e0BmiKVjB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyj61IC9y4O1eajFIx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw3j7ix_m4O6fjeX954AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy4-TMZuxbJYngQ8Mx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw58XvKpbBYlzWFchJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyGHqp3D-7GTb5h_Id4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwzjuhdXejZ9g-vYPJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxEeUhfSG0D_ImVweV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzohPviAeIyf6Vgdm54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxjhUMpviZsQdzeHKJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
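The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a response might be parsed and validated before storing the codes; the dimension vocabularies below are inferred from the sampled output and are assumptions, not the tool's actual codebook:

```python
import json

# Allowed values per dimension, inferred from the sample above
# (assumption: the real codebook may contain additional values).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "resignation", "mixed", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: dimensions}.

    Rows without an "id" are skipped; any missing or out-of-vocabulary
    dimension value is coerced to "unclear" rather than rejected.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # the model failed to echo back a comment ID
        dims = {}
        for dim, vocab in ALLOWED.items():
            value = row.get(dim, "unclear")
            dims[dim] = value if value in vocab else "unclear"
        coded[cid] = dims
    return coded
```

Coercing unknown values to "unclear" (instead of raising) keeps one malformed row from discarding an entire batch of codes, which matters when each LLM call codes many comments at once.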