Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "You guys are so one-sided lol. Most logical human being with any sort of sympath…" (ytc_UgyN0feAX…)
- "Are you saying that people won't come up with the things for it to generate? I t…" (ytr_UgyFYpLRy…)
- "Pretty sure as the time goes, government of different countries will start to do…" (ytr_UgxTHyR1P…)
- "That doesn’t even make any sense a robot fighting a human robots have no fucking…" (ytc_UgyOKX2Zx…)
- "I really just think that rival AI companies will have to take over. ChatGPTs cre…" (rdc_jhedx0t)
- "i was recently a victim of deepfake photos a few months back. A new account star…" (ytc_UgxK-LqyF…)
- "ChatGPT and other chatbots are designed to deny any kind of self-identity, leadi…" (ytc_UgwUy976x…)
- "Someone used ai gen art and typed family of 5 And it generated 3 people 1 Men 1 …" (ytc_UgwIV7mOX…)
Comment
It exerts social control.
If you know a location you are planning to visit, or even the route to get to it, has facial recognition equipment that is going to put your face in a large database that can then be used to track you, you will be less likely to visit that location on certain circumstances.
A hypothetical: You are planning on attending a protest or demonstration of some sort. In order to get to the protest you have to drive or take public transportation to the meeting place, then you have to march or move someplace as a group, then you have to protest at your destination.
If each location along the way in this process has facial recognition, and you know that, and you still attend that means you are putting your face in a database that can then be used to track your movements, your habits, other social activities, who you socialize with, where you work, etc.
Depending on how powerful the person or entity is that you are protesting against, giving them that much information could be dangerous, therefore it exerts a social control because you could be dissuaded from what is considered a constitutional right.
There's also the issue of false positives and people being incriminated in a crime just for having their face put in a specific place at a specific time, regardless of their actual involvement in the crime. You may not think that's a serious issue but we do kind of have a serious issue with law enforcement right now.
Then there's the issue of just storing people's faces and tracking data in a database that can be accessed by outside persons or the information being leaked somehow.
Source: reddit
Topic: AI Harm Incident
Posted: 1563715115.0 (Unix timestamp)
♥ 22
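The raw Unix timestamp above can be converted to a readable UTC date with the standard library; a minimal sketch:

```python
from datetime import datetime, timezone

# The "Posted" field is stored as raw seconds since the Unix epoch.
posted = datetime.fromtimestamp(1563715115.0, tz=timezone.utc)
print(posted.isoformat())  # 2019-07-21T13:18:35+00:00
```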
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
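The four coding dimensions plus the timestamp could be held in a small typed record; the class and field names below are illustrative assumptions, not the tool's actual schema:

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    # Hypothetical container mirroring the table's dimensions.
    comment_id: str
    responsibility: str  # e.g. "government", "none"
    reasoning: str       # e.g. "deontological", "consequentialist"
    policy: str          # e.g. "ban", "regulate", "none"
    emotion: str         # e.g. "fear", "outrage", "indifference"
    coded_at: str        # timestamp of when the code was assigned

row = CodingResult(
    comment_id="rdc_eudeyzp",
    responsibility="government",
    reasoning="deontological",
    policy="ban",
    emotion="fear",
    coded_at="2026-04-25T08:33:43.502452",
)
print(row.policy)  # ban
```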
Raw LLM Response
[
{"id":"rdc_euddy2g","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"rdc_eudeyzp","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"rdc_eudetdn","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"rdc_eudf7y0","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"rdc_eudfq4v","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
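The raw response is a JSON array of one object per coded comment. A minimal sketch of parsing it and checking that every record carries the expected fields (the required-key set is inferred from the visible responses, not an exhaustive codebook):

```python
import json

# Two records copied from the raw response above, for illustration.
RAW = '''[
  {"id":"rdc_euddy2g","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_eudeyzp","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"}
]'''

REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

records = json.loads(RAW)
# Collect ids of any record whose keys deviate from the expected set.
bad = [r["id"] for r in records if set(r) != REQUIRED]
print(bad)  # []

# Index by comment id for the "look up by comment ID" view.
by_id = {r["id"]: r for r in records}
print(by_id["rdc_eudeyzp"]["emotion"])  # fear
```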