Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I think most people are against automation only cause our current economic setu…" (rdc_j3wwvq5)
- "Clickbait title. Amica isnt capable of feeling emotions. 👎 Utterly disgusting th…" (ytc_Ugz1fbYBi…)
- "If the future superintelligence works for the benefit of humanity, which is like…" (ytc_UgycjcFYO…)
- "Grok set new standards with the companion update, cant imagine other major AI ch…" (ytc_Ugzz6_8CB…)
- "The rules the chatbots go by are written and codified by humans, so of course mo…" (ytc_Ugy1JGxMW…)
- "So it AI is taking 40% of jobs, how is that an increase in new job opportunities…" (ytc_UgxDKnkSN…)
- "doth the tales of john conner not weigh on the minds of the ai creators…" (ytc_UgzcOC1uI…)
- "Somebody get ahold of Elon. F**k Mars we need to crowdfund these female Ai robot…" (ytc_Ugx2kHNVs…)
Comment

> Don't need to be an AI to see that humans have become a global threat to life itself.
> The project of turning the whole planet into malls, parking lots and blocks of flats has become the #1 criminally insane conspiracy ring (a.k.a neoclassical economics).

youtube · AI Harm Incident · 2025-07-27T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw2IM5kIF5CWQUTL854AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxg5Fybtu8gkv1f2Yx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzWT5deDEtqHTIS6rV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzKEYGUbfNqweDoAGJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyM6nNuyiOC01CCvLl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzL9TmmgKwe229t8SJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx2gc0N53qhHXhVB1l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzzUaqI_KimydIpS8F4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz3U0b06fjoCkdFgu14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxhtIuOs0CZCoT2mqR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```