Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgxCqK8TE…`: "Honestly… the driving of manned trucks coupled with the lack of a driver union ,…"
- `ytc_Ugx8-hiYg…`: "my personal take on AI, on the one hand it's a great tool that can actually help…"
- `ytc_Ugxkj1Blc…`: "Don't use AI / Do NOT train it / Stop AI / Stop the bs and fake uses & replaceme…"
- `ytc_UgiyBd4oa…`: "if we ever get to the point where machines can show emotions, we might be swayed…"
- `ytc_UgzLmu4J2…`: "So we r living in a simulation and and theres ai that will destroy us inside tha…"
- `ytc_UgyfUgT1C…`: "So my question is what will the people do to system to sustain themselves to pay…"
- `ytc_UgznvcVFb…`: "Mechanics will be safe for at least 15 years. There job is way to complex in ter…"
- `ytc_UgwHBYZzv…`: "Art is one of the backbones of our humanity." / "So to make a truly human AI, havi…"
Comment
You see here is the thing, it ain't that the ai is racist it is that the statistics used to make the ai inherently have their own human biases. Take the medical treatment one, one that is likely caused by white people generally getting better treatment likely due to white people generally being more wealthy and thus having better treatment given to them in the data sets given. This overall produces a view that makes the ai seem racist, being a less extreme and less intentional version of what happened to Microsoft's Tay in 2016 in which it was taught to repeat anti-semetic views in 16 hours because the data set it was given, aka the tweets it was mimicking, was full of racist 4channers trying to pull a gag
youtube · AI Bias · 2022-12-20T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgxfQ0QwaSGRpQcUZh14AaABAg.9jqgJW048729jqok-EDzLN","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxfQ0QwaSGRpQcUZh14AaABAg.9jqgJW048729jrfy2QcRM0","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgymvF94k1tohAsnjjF4AaABAg.9jqf5fegqr_9jqoqi3gCGs","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyZ9aHkwcAfUc__yNR4AaABAg.9jqe7IcuBWS9jrkF1h3vs9","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwdE6tF8j3w0VhHHv54AaABAg.9jqZe7GNm859jqfHwsASnU","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwgc6jO8RVjbnzhlFJ4AaABAg.9jqB2NpEhBh9jqpdrxFJQj","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwZY1WN5Tz55ylDXBN4AaABAg.9jpjhCtPQqt9jruZkpzlJk","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxmRa5-KEl2TDwbg2p4AaABAg.9jp8x8xSmxG9jp9_q2ule7","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgwDX-SyHFL2XhGqmv94AaABAg.9jozTwCoDNY9jp0PT2wBSl","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyOlLsQtWifo8qBeQx4AaABAg.9jojyvKZWFH9jrTR4ENYmv","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
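Turning a raw response like the one above into per-comment coding rows amounts to parsing the JSON array, checking each row against the dimension vocabularies, and indexing by comment ID (which is also what the "look up by comment ID" view needs). A minimal sketch in Python; the allowed value sets below are only the categories visible in this sample, and the real codebooks likely contain more:

```python
import json

# Dimension vocabularies as observed in the sample output above;
# the full codebooks presumably define additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"unclear", "none"},
    "emotion": {"indifference", "fear", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the rows by comment ID.

    Raises ValueError when a row is missing a dimension or uses a value
    outside the known vocabulary, so malformed model output is caught
    before it reaches the results table.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={value!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

With the parsed dictionary in hand, a single comment's coding (the "Coding Result" table above) is just `coded[comment_id]`.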