Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I now think AI is what will do us in as a species. We're just petrol monkeys tha…" (ytc_UgxK3wDeV…)
- "What goes up must come down, and when this whole AI bubble comes crashing down i…" (rdc_ohzyct4)
- "Great discussion. Reminds me that the disturbing thing about Lord of the Rings…" (ytc_UgzEqJedD…)
- "Performance of the US and China top AI models are very similar on a per token ba…" (ytr_Ugw74p-SZ…)
- "Sad aftermath of dishonest journalists & politically driven media. Hopefully the…" (ytc_UgwbhjkJK…)
- "He said they don't know what they are saying? Really! I believe hus the male Rob…" (ytc_Ugy87Ym92…)
- "18:01 providing isn’t thriving. Robbing people of their right to hobbies is a cr…" (ytc_UgwVVn5I-…)
- "Damn, this is partly why I keep my old sketchbooks and show people my old art th…" (ytc_Ugz_LEv_a…)
Comment
Actually, I think that alignment towards submissivity will be easier to achieve than alignment towards the strict forms of rationality that necessitate creation of explicit operational goals (and sticking to them despite imagined reactions informed user would have).
At least current LLMs are far better in roleplaying (as biased or otherwise irratonal personas) than resisting not entirely rational user suggestions.
youtube · AI Governance · 2025-11-14T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyluCUitRT-tHv83dF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwbckJa_MricI8LX194AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxbDbzuWlB6OpXOvdx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxq2vdvx-TJjXhPZjd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugza87ESaIPhCKdr1AZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwktKDszx9_U_oQlnd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyzUp3DZJD_UrTCWT94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwV6vIelsVJwa6bMmh4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzvolYXznm83c_2PTh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwXBv6qQtQ_daN1Go14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
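The raw response above is a JSON array of per-comment codings keyed by `id`. A minimal sketch of the "look up by comment ID" step, assuming exactly that format (the `lookup` helper name is illustrative, not part of the tool; the two sample rows are copied from the response above):

```python
import json

# Two rows copied from the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgyluCUitRT-tHv83dF4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwbckJa_MricI8LX194AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]"""

def lookup(raw: str, comment_id: str):
    """Return the coded dimensions for one comment id, or None if absent."""
    return next((row for row in json.loads(raw) if row["id"] == comment_id), None)

coding = lookup(raw_response, "ytc_UgyluCUitRT-tHv83dF4AaABAg")
# coding["emotion"] == "indifference"
```

Dimension values (`responsibility`, `reasoning`, `policy`, `emotion`) then populate the Coding Result table for the matched comment.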