Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Its pretty much evil its not natural and once computers stop being built to send and receive and instead share flexible combined memory and processors like the brain has , we will have wiped our selves out. its spectacularly dumb but unfortunately humans are not able to resist building a thing if they can build it. There is no benefit to AI. when billions of people are left with nothing to do no income and no purpose we will not act peacefully , its not like the rich are going to let us not pay rent and all take up nice hobbies and eat free food.
But Ai on its own can not hurt us, it requires complete utter Human idiots to put it in charge of physical tasks like dangerous virus experiments and self replication and automation within war fighting drones etc
youtube · AI Harm Incident · 2025-07-23T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwKHma0AIU6SrGOMAp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZioMH4y77xTIkYYx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxKiXfRUz6Pw-M_sB94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy7dpM2bPGJEM88OXx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzS-OxIpZFraqGHpVV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwOteGG9Nsm-OZ8K194AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyDUsV28u4mPDnWY_J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy50pckzMUvKi_X7o94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxcQdYmP6W8EAnc64J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzxfGynGDC__BQxt0F4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
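The raw response above is a JSON array of per-comment codes, one object per comment ID, with four coding dimensions. A minimal sketch for parsing and sanity-checking such a response before storing it — note that the `ALLOWED` category sets are inferred solely from the values visible in this sample (the real coding scheme may permit more), and `validate_response` is a hypothetical helper, not part of the actual pipeline:

```python
import json

# Category sets inferred from the sample response above; the real coding
# scheme may allow additional values (assumption).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"indifference", "fear", "resignation", "mixed",
                "approval", "outrage"},
}

def validate_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if any record carries a value outside ALLOWED,
    so malformed model output is caught before it reaches storage.
    """
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}"
                )
        by_id[rec["id"]] = rec
    return by_id

# Usage with a one-record response in the same shape as the dump above:
raw = ('[{"id":"ytc_UgwZioMH4y77xTIkYYx4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"fear"}]')
coded = validate_response(raw)
print(coded["ytc_UgwZioMH4y77xTIkYYx4AaABAg"]["policy"])  # ban
```

Indexing by ID is what makes the "look up by comment ID" view cheap: each record is fetched in one dictionary access rather than a scan of the array.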