Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The 13th Amendment die its legal context applies to humans only, not AI. Other t…
ytc_UgyWpLKiQ…
I miss when AI was about automating mundane everyday tasks so that people could …
ytc_UgyjWqgI0…
AI is going bust, any company that relies on AI will go bust with it.…
ytc_Ugz4NnjB2…
AI is just toying with the fat man. It knows the religion of Israel all right h…
ytc_Ugz0p78eq…
@skullingtonturtle8080 youse have no idea what Ai is capable of, i have a frien…
ytr_UgyrVQaff…
>Dressed in traditional saris, wearing make-up and jewellery, we’ve encounter…
rdc_cdm2jhu
With AICarma, I can track what AI says about my brand, making my strategy much s…
ytc_UgzqfJM0u…
@MASKEDB Why are you wasting your time thanking him? They did nothing, but copy …
ytr_UgxtTThtx…
Comment
We have more options for destroying humanity than ever before. Amazing, isn't it? We can now choose between nuclear war, climate catastrophe, pandemics, unlabeled genetic engineering in plants, and now even AI that places itself above us. Actually, we wouldn't even need any of that to wipe ourselves out, we can manage that without AI, but it would certainly be much more fun with AI. What would our ancestors think about that? What would the ancient Greek philosophers say? I think they would say—turn back before it's too late—or they would laugh themselves to death, even though they're already dead. That could cause additional problems in the time continuum, unthinkable. ;-)
youtube
2025-12-06T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgzCX0-xmR8UNch4v214AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgxZGa02IcH-J2PvWEV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},{"id":"ytc_UgyizSSelWkX-3DBUSp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},{"id":"ytc_UgzBk5ZDYxK6--WUYqJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgygQOYm7T8WsPMCLRd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_UgwPvEzL0TAT55aGqxt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},{"id":"ytc_UgzyUu5A2zQpXs549OB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},{"id":"ytc_UgzMwBazFTJ6Vp0Er6F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_UgwpXv43-tJNqn7a5oB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"},{"id":"ytc_Ugz-iSiR2jH-v0zUv1l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]
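The raw response above is a JSON array of per-comment codes, one object per comment ID with one value for each of the four dimensions. A minimal sketch of parsing and validating such a response (the allowed value sets below are inferred from the values visible in this response, not from the actual code book, and may be incomplete):

```python
import json

# Allowed values per dimension -- inferred from the sample response above;
# the project's real code book may define more categories.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"resignation", "fear", "outrage", "indifference", "approval", "unclear"},
}


def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}.

    Raises ValueError if an entry is missing a dimension or uses a value
    outside the allowed set, so malformed model output fails loudly
    instead of polluting the coded dataset.
    """
    records = json.loads(raw)
    out = {}
    for rec in records:
        cid = rec["id"]
        codes = {}
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
            codes[dim] = value
        out[cid] = codes
    return out


# Hypothetical single-record example in the same shape as the response above.
sample = ('[{"id":"ytc_X","responsibility":"none","reasoning":"unclear",'
          '"policy":"none","emotion":"approval"}]')
print(parse_codes(sample)["ytc_X"]["emotion"])  # approval
```

Validating against a closed vocabulary at parse time also catches the occasional truncated or unbalanced JSON the model can emit, since `json.loads` raises on malformed input.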