Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples
- Can AI measure the Hindustani version of Pathos in music? That which makes the m… (ytc_UgzNHsnpw…)
- How can you call it dystopian to live in a society where people don't work anymo… (ytc_UgyS2FSAA…)
- AI ( artificial intelligence) is a software made by the powerful and the beautif… (ytc_UgySQx_XH…)
- Impeccable video my dude! *At least we'll get to watch AI mook us all, slowly a… (ytc_UgzUWiL2N…)
- Good. Well, the discomfort isn't good for the AI, but the rest of this is all g… (ytc_UgyTkbEPc…)
- Just like Amazon is killing retail brick & mortar Ai will do a lot worst 😮… (ytc_Ugzg8ZN-A…)
- 5:11 "Good luck little guy." The way your voice is still so sweet and melodycal… (ytc_UgyFmm5mA…)
- My input as an artist for that past ten years and aiming to go into an art indus… (ytc_UgwbXHGrG…)
Comment

> At least with the police one, if you’re in Chicago or Atlanta, it doesn’t matter what you do if you use real world statistics, the computer is very likely going to flag Black people as being more likely to commit a crime because, statistically speaking, they are, they commit the most crimes at the highest rates, it’s not because of the skin but the communities and culture there in, but the computer doesn’t recognize that they simply see Black people commit the most crimes at the highest rates, if there are issues addressed with these poor communities, such as culture and education, then we’re simply gonna have to tell AI to ignore certain facts

youtube · AI Bias · 2022-12-24T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
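A coded record like the one above can be checked before it is stored. The sketch below is illustrative, not the project's actual pipeline, and the allowed-value sets are assumptions drawn only from the values visible in this export, not a complete codebook:

```python
# Minimal validation sketch for one coding-result record.
# ALLOWED is an assumption: it lists only the dimension values
# observed in this export, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passed."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The record shown in the Coding Result table above.
record = {"responsibility": "distributed", "reasoning": "consequentialist",
          "policy": "industry_self", "emotion": "mixed"}
print(validate_coding(record))  # → []
```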
Raw LLM Response
```json
[
{"id":"ytc_Ugzxa_dKtr3JeDgfh2N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyDqQxyGXZzGSvydJZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzQKc96e1estn6gZgt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxg1qcRPetovEvZ4pF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwx6xeoBHtP4CERcdF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxSZIZHFHShPMITMuF4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwA5y00Ky7GxEhv33p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyecKRQdOo3lADq2DF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgzIdi-wjOiQiLWLdjV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzIkoyhS3nk90PzOtF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
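Because the raw response is a JSON array keyed by comment ID, looking up any comment's coding reduces to parsing the array and indexing it. A minimal sketch, assuming the response parses cleanly (the variable names are illustrative; the sample record is one entry from the array above):

```python
import json

# A one-entry stand-in for the raw LLM response shown above.
raw_response = '''[
 {"id":"ytc_UgzIdi-wjOiQiLWLdjV4AaABAg","responsibility":"developer",
  "reasoning":"deontological","policy":"regulate","emotion":"fear"}
]'''

# Index the batch by comment ID so each lookup is a single dict access.
codings = {item["id"]: item for item in json.loads(raw_response)}

coding = codings["ytc_UgzIdi-wjOiQiLWLdjV4AaABAg"]
print(coding["emotion"])  # → fear
```

In practice the parse step may also need to tolerate malformed model output (e.g. trailing text around the array), which this sketch does not handle.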