Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I fucked around and pretended too hard, now I'm part of some "AI leader" initiat…" (rdc_o8gwgii)
- "Oh I completely agree. My disagreeing is when people hate on someone just for ha…" (ytc_Ugy8DQN0_…)
- "I'm not going to a hospital with AI doctors and nurses, and sure as hell not get…" (ytc_UgwMwN0ru…)
- "Whoever calls themselves an AI artist should just remove the artist part because…" (ytc_UgwB3hBa4…)
- "Dinner = save child. Shoes = save child. Morality = save child. The human change…" (ytc_UgyuUHOh6…)
- "They get really close to discussing something I've been thinking a lot about wit…" (ytc_UgyfqT-dD…)
- "I'm pretty sure I don't want a Chinese robot loaded with spyware in the surveill…" (ytc_UgwDuG0Z2…)
- "What if all this stupidity is because they plan on using Anthropic to turn AI ag…" (ytc_UgyJECPLd…)
Comment
"meeting peoples' energy in a conversation" - This!
What I have seen a lot of over the past couple of years is exactly this... I find that if I am convinced something is possible, then the AI also becomes convinced and will grind away on any given problem until we're going round in circles with zero progress. Yet if I start to believe something may be unachievable or that we're following the wrong path, the AI often starts to suggest that maybe we cannot complete the task and should just give up.
youtube · AI Moral Status · 2025-11-01T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwDx3DQjiqU2qJG6FZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwTK6k8Aqw9vNPIK-94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwei_7KP3azDFb_-Pp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyjvbECDnG4bkxbxWB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxQrs3xC8lMDghTtEV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzVkOt8_Xb97UiZNcJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzUgLam1hNwDO55mjN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxTrEIy5Yb9WlaNc6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxVPdJuAHQIJOjuimN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugych_K1BB1AgP2OzlV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
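The raw response above is a JSON array with one object per comment ID, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such output can be parsed and a single comment's coding looked up by ID — the `lookup_coding` helper and the two-row sample payload are illustrative, not part of the tool itself:

```python
import json

# Sample raw model output, shaped like the array above
# (two rows copied from the dump; not the full response).
raw_response = """
[
  {"id": "ytc_UgyjvbECDnG4bkxbxWB4AaABAg",
   "responsibility": "user", "reasoning": "virtue",
   "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgwDx3DQjiqU2qJG6FZ4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the model's JSON array and return the row for one comment ID,
    or None if the model did not code that comment."""
    rows = json.loads(raw)
    return next((r for r in rows if r.get("id") == comment_id), None)

row = lookup_coding(raw_response, "ytc_UgyjvbECDnG4bkxbxWB4AaABAg")
print(row["emotion"])  # resignation
```

Keying on `id` rather than array position guards against the model reordering or dropping comments in its response.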