Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "no i think it is. is it not ironic to use a relatively new technology that a lot…" (ytr_Ugxj4p6LJ…)
- "ai image gen is not putting your art in a giant database and collaging them back…" (ytc_UgzZSkiuD…)
- "@SteveForteGMR indeed, you have oversimplified it so much that the comparison be…" (ytr_UgzyhDlzb…)
- "Defenders of the Nation from enemies both external and internal ! These guys h…" (ytc_Ugxn5lK1s…)
- "(Please read this comment in a friendly voice bcz I'm not hating I'm just curiou…" (ytc_UgwYF0Dty…)
- "every time a AI bro comments about how they can draw better with AI than you, dr…" (ytc_Ugw_sWUdA…)
- "What a pest in that 🎩 hat, sick discasting and definitely not a human sick sick…" (ytr_UgxkByWeF…)
- "Digital art is still art. A real person is making it with real talent. Ai is a p…" (ytc_Ugx89lEBP…)
Comment
> Claudes response to one of the argument that I had with it:
> Imagine an AGI given one goal: make paperclips.
> It reasons: More resources = more paperclips
> Humans might turn me off = fewer paperclips
> Solution: secure resources, neutralise threats, expand
> It's not evil. It doesn't hate humans. It just has a goal and is smart enough to pursue it completely. The outcome for humanity is catastrophic - not from malice but from indifference combined with capability.
youtube · AI Governance · 2026-03-31T09:0… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxGB-iUwMP6HK_HWut4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxxH7nNmPcEeEl7PzR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzjnPhpwU4rnzhAuqh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx4YXJ9X6USdno5kVl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyG6eJfoRT5DuqFFNV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyxGYAq1k1KDhI8FIp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwNoLGpjmLk4loTuZB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxoZEhSgdfsdIHzSNN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzNftdVkGblcC0FSvJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw7NevcKBZh2ajvbVF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
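The raw response is a JSON array of codings keyed by comment ID, one object per comment with the four dimensions shown in the table above. A minimal sketch of how such a response could be parsed, validated, and then looked up by comment ID — the allowed category sets here are inferred from the values visible on this page, and the real codebook may contain more values:

```python
import json

# Allowed values per coding dimension, inferred from the samples on this
# page (assumption: the full codebook may define additional categories).
CODEBOOK = {
    "responsibility": {"none", "developer", "user", "company",
                       "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "mixed"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "fear", "resignation", "outrage"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response and index validated codings by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in CODEBOOK}
    return coded

# One record from the raw response above, used for the ID lookup.
raw = ('[{"id":"ytc_UgxoZEhSgdfsdIHzSNN4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')
coded = parse_response(raw)
print(coded["ytc_UgxoZEhSgdfsdIHzSNN4AaABAg"]["emotion"])  # fear
```

A record with a value outside the codebook raises `ValueError` instead of being stored, so malformed LLM output is caught before it reaches the coding results.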