Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugyu8BR9E…`: Sam Altman got fired over DEI policy implementations, not over some "i will kill…
- `ytc_UgxYdYKqt…`: AI means bot software automation and automatons - robots. Of course - industry a…
- `ytc_Ugw2-h6Fp…`: bro i spent SIX HOURS TO DRAW ONLY THE FACE OF A PERSON FOR SOMEONE TO COME IN W…
- `ytc_UgwLYo_aj…`: As an artist with a webcomic, I don't care about AI art. It will never be able t…
- `rdc_o3gsqd8`: Gonna guess that this will be filled with humans doing their best *scary, rogue …
- `ytr_UgwJUuAwb…`: @Chicago48Barely and it adds weird texts. It is just a glorified editor for my e…
- `ytr_UgzL6MuE0…`: Ehh.. not really. I think it’s best to avoid ai “art” all together. Any ai gener…
- `ytc_UgySqI_OO…`: The ChatGPT cheating is so bad at my college (masters program). The weekly discu…
Comment

> The key problem is that AI is learning from human generated data sets. So in effect it holds up our own biases and magnifies them back at us. although different approaches to learning can get around that, it is still an open question how we get even something like a language model to tell the truth, since we don't know the truth of everything ourselves. Most likely we create AI's that do what they think we want to see not what we want them to do. small difference but an ever so important one.

youtube · AI Bias · 2022-12-24T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwSllbU4Z4ADIZ9rid4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzQ1l0zYxOA9euRTxF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwPlavmnTjotP6TH454AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzeoF8bBA-l2cdtlAZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyD8L-YKze86G7qkU94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzkoncLwMvG1xD6oxB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw1tEgHzryYlVz7hyF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzqIg8yrcVS83RxoGp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwFy4OsDIXNkyBM8IF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw0p3wV8sUTUpHAfnh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
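A raw response like the one above can be parsed and checked before its codes are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the values visible on this page (the real codebook may define more), and `validate_coding` is an illustrative helper, not part of any actual pipeline.

```python
import json

# Allowed values per coding dimension — inferred from the responses shown
# on this page; the real codebook may permit additional values.
ALLOWED = {
    "responsibility": {"distributed", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "unclear"},
    "emotion": {"resignation", "outrage", "indifference",
                "approval", "fear", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against ALLOWED."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Example with a shortened, made-up comment ID:
raw = ('[{"id":"ytc_x","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"unclear",'
       '"emotion":"resignation"}]')
records = validate_coding(raw)
print(len(records))  # 1
```

Rejecting out-of-vocabulary values early keeps occasional malformed LLM output from silently contaminating the coded dataset.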