Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgwhD-6jN…` — "Maybe if she didn't have so many kids she could have concentrated on the one she…"
- `ytc_UgwzynS94…` — "And companies like Facebook and Amazon have already fired thousands of their dev…"
- `ytc_UgzWFIIUS…` — "Here's the thing. It isn't art. It is replication. There was no heart, soul, per…"
- `ytc_Ugyrj_nw0…` — "What’s really going to happen isn’t making better computer but seeing how humank…"
- `ytc_UgwU06g_g…` — "The farm industry had to deal with being automated years ago. The ones that …"
- `rdc_ohqg424` — "Most aircraft parts are only G-rated to human capability (or just barely above i…"
- `ytc_Ugx0eTvpA…` — "The entire human species is experiencing a collective anxiety about AI. Or is ev…"
- `rdc_lqsg4x5` — ">no, because none of my tasks are defined inputs, my manager would just tell …"
Comment
AI has its uses here and there, but there is a point where you're just using it to make up for you not wanting to do the damn work yourselves.
This is what you get when your company is too lazy to get off their dead asses and do the damn work of manually going through the job applications and resumes themselves.
Enjoy the lawsuit, Workday. It could've all been avoided if you'd just get off your dead asses.
Source: youtube · AI Harm Incident · 2026-02-13T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzR2fYvG7PxDVz_h1B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyHFdVjLBU6s52LREB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzJfq-KII2l-4-QBwd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwu9mPJM2hegNIwMax4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzJyoDoFWFLg6uYc0l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz5CiTcEd-Be4arDCN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxQHMnS0CHycGoKnhl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyVw5yIkdKRqzaX4wx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwlarMiREgKR6xSBaR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyDrut9qiTuqzX6Pvt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
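Each record in the raw response carries the four coding dimensions shown in the table above. A minimal sketch of how such a response could be parsed and sanity-checked before use, assuming the value sets inferred from the samples on this page (the real codebook may define additional categories, and `validate_codings` is a hypothetical helper, not part of this tool):

```python
import json

# Allowed values inferred from the samples shown on this page;
# this is an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"user", "company", "developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "unclear", "ban", "liability", "regulate"},
    "emotion": {"outrage", "fear", "approval", "mixed", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept if it has an "id" and every coding dimension
    holds one of the allowed values above.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical example input, shaped like the response above.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability",'
       '"emotion":"outrage"}]')
print(len(validate_codings(raw)))  # 1
```

Dropping malformed records (rather than raising) keeps a batch of ten codings usable even if the model garbles one entry; a stricter pipeline might log or re-prompt for the rejected IDs instead.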