Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Even the news broadcast will be replaced with an AI chat bot in the long run.…
ytc_UgyaKnYpX…
At 3:03, he brings up designers. The more variables involved in a task, the grea…
ytc_Ugz8jf9ds…
All I hear during interviews on the news and in podcasts... the interviewee sayi…
ytc_Ugz6idoqS…
{
"system_id": "PNS_ETHICAL_ORIENTATION",
"version": "1.0",
"created_by": …
ytc_UgzJ7Xd5m…
AI is the biggest pile of dogshit humanity has ever created. What a shitty time …
ytc_UgwW9JbLg…
Forget 'Going in circles,' this car could've had a MAJOR ACCIDENT!!!
The passen…
ytc_UgwbrcA1a…
Without the human and their unique commands, of what said human desires, there w…
ytc_UgzTVIXoV…
That's also where social media comes into play. Because of the inherent way algo…
ytr_UgxxJrJWG…
Comment
First, this kind of thing was inevitable. All tools get misused sooner or later.
Second, as usual with technology that people don't understand and therefore fear, the focus is being placed in the wrong place. Facial recognition is flawed and early in its development; there are some inherent biases in the software that still need to be worked out of it, for example.
But that isn't the actual issue here. The actual issue is not the technology, but how the police investigate crimes. What happened here is the police fed a picture into a computer and it spit out a result. Instead of doing actual police work to determine whether that result appeared to be accurate, they went and grabbed the person and then tried to make everything fit their suspect (and, failing that, force a confession) instead of finding a suspect that fit the evidence. I'm sure the software gave at best a 98% likelihood of a match, too. Only an idiot would ever report a 100% match, even when comparing the same picture, since twins exist.
He probably spent 30 hours there because he refused to confess and they couldn't be bothered to check out his alibi.
This is why things like a DNA database and facial recognition are dangerous. Not because the technology is flawed, but because our law enforcement and prosecution intentionally misuse them in order to get the easy win.
reddit · AI Harm Incident · 1626269414.0 (2021-07-14 UTC) · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[
{"id":"rdc_h55vojo","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"rdc_h55hk0e","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"rdc_h53pc5k","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"rdc_h55b36q","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"rdc_h53znx5","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
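A raw batch response like the one above is only usable if every entry stays inside the coding vocabulary. The minimal sketch below parses such a response and drops entries with out-of-vocabulary codes; the value sets in `SCHEMA` are inferred from the samples shown on this page (assumed vocabularies, since the full codebook may contain more labels), and `validate_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# These vocabularies are assumptions; the real codebook may be larger.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "user", "government"},
    "reasoning": {"none", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"fear", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM batch response; keep only entries whose codes
    fall inside the allowed vocabulary for every dimension."""
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        if all(entry.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(entry)
    return valid

raw = '''[
 {"id":"rdc_h55vojo","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"rdc_h55hk0e","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''
print(len(validate_batch(raw)))  # 2
```

Rejecting (rather than repairing) malformed entries keeps the coded dataset consistent: a hallucinated label like `"responsibility": "aliens"` silently written through would corrupt downstream counts.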