Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "While the promise AI to be everything for everyone is a decade away, in the mean…" (ytc_UgwyhFzec…)
- "They can have all the Microsoft paint furry inflation stuff lol. AI art has abou…" (ytc_Ugw3Fet37…)
- "It's like a perfect storm, a lot of employees have become high maintenance, hig…" (ytc_UgzoV-zLL…)
- "This is nonsense. AI cannot do human jobs. Unless your job is to create AI slop.…" (ytc_UgwLHeUXn…)
- "Problem with your overly long statement is that you're making it assuming that t…" (ytr_UgxrcQFPg…)
- "Me who is using a local model with a local LLM running 100% locally on my comput…" (ytc_UgwptqTOk…)
- "In other words, AI studied humans' internet and is getting all the answers based…" (ytc_Ugz7SiMQ-…)
- "💯 AI is already out of control and look at how much it's ramping up now.…" (ytr_UgyBJNaFJ…)
Comment
Why self driving cars are bad.
1. Hackers. We all know hackers will do it if it grants them an opportunity at money. By imprisoning someone in a car that they could make crash at any moment, people will give away their bank info to save their life.
2. There will be thousands of bugs in the code (that car showed one of them) and no you can not get rid of all the bugs, they're always be some there. A saying programmers have can help explain it "99 bugs in the code, 99 bugs in the code, take one down, patch it up, 117 bugs in the code".
youtube
AI Harm Incident
2018-04-02T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxu48WTenHCgOgQdwR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwLqzF1X1NEeHPH_C54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzv81s398PkyR_p_Wh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyMJgjlj3ONO3x1qBp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"sadness"},
{"id":"ytc_Ugy3Jvbtv1IIMc_Gb4h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwEaj_OvSl6JkcqRht4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwmsL2y5MiyftdfEfh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwxaGE7TUjtUpSAjVd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgypSz2b-trrwPj1Wt54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwpojYDdejAB2qyjP54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
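A response like the one above can be parsed into a lookup table keyed by comment ID. The sketch below is illustrative only: the field names come from the JSON shown here, but the sets of allowed category values are inferred from these ten rows and the coding-result table, so the real codebook may include more categories.

```python
import json

# Allowed values per dimension, inferred from the responses shown above
# (an assumption -- the full codebook may define additional categories).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "resignation", "sadness", "fear", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, dropping rows
    that are missing a dimension or use an unrecognized value."""
    out = {}
    for row in json.loads(raw):
        dims = {k: row.get(k) for k in ALLOWED}
        if all(dims[k] in ALLOWED[k] for k in ALLOWED):
            out[row["id"]] = dims
    return out

# One row from the response above, used as a self-contained example.
raw = ('[{"id":"ytc_Ugy3Jvbtv1IIMc_Gb4h4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugy3Jvbtv1IIMc_Gb4h4AaABAg"]["emotion"])  # fear
```

Validating against a closed value set this way catches the most common batch-coding failure, a model drifting off the label vocabulary, without rejecting the whole response.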