Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- ytc_UgxtEnF2R…: Why are we trying to create something that is more creative than everyone? And h…
- ytc_Ugx-Yfv6S…: A human's intrinsic character is to follow the least resistance path. It is very…
- ytc_UgykoU2fo…: I just shared this on my FB page with the caption, "You will learn more about AI…
- ytc_UgxHcysWn…: I wonder what AI these guys have access too - the couple I’ve used have produced…
- ytr_UgwTKkp1m…: AI can literally analyze the most successful stories/storytellers, study their c…
- ytc_UgwUg-fMN…: The biggest danger is that these language models are owned by large tech corpora…
- ytc_Ugx1EuwP5…: Considering that AI art is still absolutely no threat to real art, I think that …
- ytc_UgyloLHXw…: Well we already know the tech bros are scheming against us, so why would it not …
Comment

> There is a reason why humans have emotions like shame and humility. They are there for us to not behave like monsters. AI/computers don’t have those. So just freaking STOP with developing these DANGEROUS technologies!!
>
> Developing AI basically means you agree to the world being run by narcissists that don’t feel human emotions like empathy.

youtube · AI Harm Incident · 2025-09-13T12:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id": "ytc_UgzUPrrlbpENjit66xZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyLSkGxAQv8TLTgQ854AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzpaBbtk5-oZOzliPh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzUAaro-XLVKSpUS4J4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwnVpN0B8bP8Or2VKx4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwWXw-IKvTZq_ONn6B4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwLN1WFAd7iZq7Kngl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxMbeUK9D5KwjpsExN4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzrDYBui0s6iHtEfEh4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyoLy3FJtHrX0THrD54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
```
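The lookup-by-comment-ID step above can be sketched in Python: parse the model's JSON array and index each coding record by its `id` field, so any coded comment can be fetched directly. This is a minimal sketch assuming the response schema shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the function name `index_codes` is illustrative, not part of the tool.

```python
import json

# A two-record excerpt of the raw LLM response shown above (same schema).
raw_response = """
[
  {"id": "ytc_UgzUPrrlbpENjit66xZ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyLSkGxAQv8TLTgQ854AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "ban", "emotion": "outrage"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse the model output and index each coding record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw_response)
print(codes["ytc_UgyLSkGxAQv8TLTgQ854AaABAg"]["emotion"])  # outrage
```

A real pipeline would also validate that every record carries all five dimensions before indexing, since malformed model output is the common failure mode here.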