Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "We are on a mission to poison AI generators. The Automatons won't stop Super Art…" (ytc_UgyIHka4K…)
- "Ruslan responses to chatgpt shows that he is willing to justify rampant death an…" (ytc_Ugxy81RSI…)
- "That's an interesting theory. I still think there will be an overall pop from co…" (rdc_nc1w2wh)
- "Believe what you want AI scares the hell out of me May god help us all…" (ytc_UgzL5Yb2W…)
- "Obviously The First One The People Who Are Saying Number Two Have Never Seen Ai …" (ytc_Ugzd8zzQ0…)
- "Not sure but developers are losing jobs and the demand is shrinking. So I think …" (ytc_UgxNmYIR2…)
- "@aroace7913 stealing implies that creator doesn't have it anymore. People seem …" (ytr_Ugxx1tjyz…)
- "If for AI pretending to be human is the scariest part, we're in deep trouble. Re…" (ytc_UgwOc-pKk…)
Comment
> This delusional self-deception will soon be categorized in the DSM. It seems that if someone is foolish enough to pay $99 a year to interact with one of these robots, ultimately training these online chatbots without realizing it, on how to respond to their every thought, emotion, and desire, they will eventually surrender their sense of reality to it. This will probably be more common among lonely, young, elderly, and naïve, open-minded people. In addition to all the other problems that will come from this, there is going to be a growing issue, likely subject to legal and ethical scrutiny in the very near future, and for good reason. Always remember those responsible for this. Zuckerberg and Musk, no matter how you feel about them, remember, they were the ones.
Platform: youtube · Video: AI Harm Incident · Posted: 2026-01-02T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzKsyzLucXSdcSboS94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyfol6SyVM8cS4NyYZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzkvFuOB2v67visPFp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy_Om-2G08Hfi4WHbp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxorw8JeyzksnUoSs14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyZ71o0pUEL4iD6qzF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwkdQSluSxETEZnid94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx7GAu_2-H1jEIIix54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzwQb5sOzCKQsXJWeV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxkcYdJEBTDhUBUvV14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
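The lookup-by-comment-ID flow above can be sketched in a few lines: parse the raw LLM response as a JSON array and index the records by their `id` field. This is a minimal illustration, not the dashboard's actual implementation; the `index_by_comment_id` helper name is hypothetical, and the sample data below is truncated to two records copied from the response shown above.

```python
import json

# Raw LLM coding response, as shown above (truncated to two records for brevity).
raw_response = """
[
  {"id": "ytc_UgzwQb5sOzCKQsXJWeV4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzKsyzLucXSdcSboS94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM coding response and index the coded records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)

# Looking up the coded comment featured above reproduces its dimension table.
record = codes["ytc_UgzwQb5sOzCKQsXJWeV4AaABAg"]
print(record["responsibility"], record["reasoning"], record["policy"], record["emotion"])
# user deontological regulate fear
```

Because each record carries its own `id`, the lookup stays correct even if the model returns the batch in a different order than the comments were submitted.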