Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "About the self driving thing. That's a Tesla problem only. I'm not trying to d…" (ytc_UgwuUE275…)
- "putting copyrighted Art and images in the database without the artist's consent …" (ytr_UgxXwxR7_…)
- "Boy oh boy they rub the fact your broke and sold into a system of slavery right …" (ytc_Ugy8GTb_j…)
- "Disabled artist here (various nerve issues from a childhood brain tumor; I still…" (ytc_Ugw8gZant…)
- "The population of the world should be making rules for continued growth of AI be…" (ytc_Ugy3O4jJh…)
- "ITT: republicans bad. Can Reddit go 5 minutes without shoehorning their politic…" (rdc_jxz7fkb)
- "I wish AI could be something great like it could be. No one wants to talk about …" (ytc_Ugwb3cR_a…)
- "Thats why i dont like self driving cars. It could litteraly drive of a clif and …" (ytc_Ugxhm7ukg…)
Comment
When people say “AI is going to replace us” they tend to picture scenes out of ‘Terminator’. While that’s not necessarily an impossibility, the reality is so much fucking dumber than fiction.
We’re already starting to see AI replace humans by taking over human jobs.
Then humans will be kicked out onto the streets, starving and dying, because machines replaced all/ most basic manual labor.
I told you it was fucking dumb.
Dumbest of all, it’s a human-caused, self destructive problem, easily fixed by the humans causing it, but they won’t because they’re fucking stupid and greedy.
Platform: youtube
Topic: AI Harm Incident
Posted: 2025-09-12T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwhLjx5jDAq41z0KAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwm5UoiBq9KXbwDx5x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgyMqGLWLGSSdWmT4yt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyi0Z95my4NJoBH8xp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy4tkUqG_DHiyoUWoF4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyhVqDlod9I__-A3Yd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgyPfce0sI6rU2FmF4d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy_6XpQmk9-pswQJjZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzJrmiOTQPTTdLrvLR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxWFIQbTv_wF7ctOVB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
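A response like the one above is only usable if every row carries a label the codebook recognizes. As a minimal sketch of that check, the Python below parses the raw model output and rejects any row whose value falls outside an allowed set; the sets here are inferred from the labels visible in this page's samples and are an assumption, not the project's actual codebook.

```python
import json

# Allowed values per coding dimension, inferred from the samples above;
# the real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "government", "developer", "company",
                       "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "unclear"},
    "emotion": {"fear", "resignation", "outrage", "indifference",
                "approval", "mixed", "unclear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"fear"}]'
print(len(validate_codes(raw)))  # 1
```

In practice a failed row would be re-queued for the model (or a human coder) rather than silently kept, so that tables like the "Coding Result" above never display out-of-vocabulary labels.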