Raw LLM Responses
Inspect the exact model output behind any coded comment, looked up by its comment ID.
Comment
All rainbows and sunshine aside I would like to see how many incorrect answers for open questions AI currently gets. Currently, we see a decline in LLMs quality for programming. Over 50% of answers are incorrect. But lvl of confidence is always high.
youtube
AI Harm Incident
2024-06-03T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
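Each dimension in the table above is categorical. A minimal validation sketch, with value sets inferred only from the codings visible on this page (the full codebook may allow more values):

```python
# Allowed values per coding dimension, inferred from the responses shown here.
# This is an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "industry_self"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def invalid_dimensions(coding: dict) -> list[str]:
    """Return the names of dimensions whose value falls outside the inferred sets."""
    return [dim for dim, allowed in ALLOWED.items() if coding.get(dim) not in allowed]

# The coding shown in the table above passes; an out-of-set emotion is flagged.
ok = {"responsibility": "ai_itself", "reasoning": "consequentialist",
      "policy": "none", "emotion": "fear"}
bad = dict(ok, emotion="joy")
print(invalid_dimensions(ok))   # []
print(invalid_dimensions(bad))  # ['emotion']
```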
Raw LLM Response
[
{"id":"ytc_UgxyPffczbWOGk8Zoe14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzB9RmLiKshpQkR1nV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzBhqweGX_F9veRiPZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy2PkXIc5vSCAlR3ah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgynjVHVGYNuplpPvlZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyGzjUWlc4hW7LsWdd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw3KZl7iVFkVZO9uiN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwEhVvKVlSpuycaGn54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyN9U7c4NsBrvNimTJ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyNHmrQ-0MU7t16qSV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]