Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
These arguments seem to always assume all actors are good actors. What if AI is …
ytc_UgyuXF-ZW…
Generative AI has gotten so good its hard to tell the difference anymore
That …
ytc_UgwYx8P3i…
Men: "I only like natural women"
Men: thinks AI is a natural woman
Did you notic…
ytc_UgwqgJUib…
Yeah, obviously people shouldn't be able to make money off of AI generations. Th…
ytc_UgwlLlJA4…
Great. Yet another Silicon Valley grifter, but this one's been given the keys t…
ytc_UgxOF4Xw7…
I would not worry too much over what a guy who figures we live in a simulation t…
ytc_UgxlgwELU…
@Eisenbison if that's true then I wasn't reffering to you bro, I'm talking about…
ytr_UgyAtPeIB…
The “better education” argument has always been a crock. That’s how we’ve got an…
rdc_j3zdfme
Comment
1:22 the more you frame your high level understanding of LLMs from the mathematical, statistical view, the more you see the current problems as addressable from an engineering perspective. If you just hear convincing arguments and you don't go learn what is necessary to judge whether the thing you read has a basis in reality and is not guided by the if-it-bleeds-it-leads scare you into buying their book mentality, then you are at best doing yourself a disservice and at worst engaging in motivated reasoning.
youtube · AI Moral Status · 2025-10-31T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugznx6Vrfa_ILXDDAmN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwIzZsIk9hou_DkG5d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxuC2lR1DcVZvxeph4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2WvPg2zwHagKEc_p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw29TXfU1-C6sJ4Iv14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhFUeHflYZB26QLxF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwP_OwAJj7ACUAxfkV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgziSIhT7JSsVAbovId4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxG34lc0Pl01TyzbH94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzpiCz-nk2S8FTrSet4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]
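The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of parsing such a payload into a lookup by comment ID, assuming only the four dimensions shown in the Coding Result table (the real codebook and any extra fields are assumptions here):

```python
import json

# The four coding dimensions visible in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}.

    Missing dimensions fall back to "unclear", mirroring how uncoded
    values are displayed in the result table (an assumption).
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        coded[cid] = {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

# Hypothetical single-item payload in the same shape as the response above.
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]'
codes = parse_codes(raw)
print(codes["ytc_example"]["emotion"])  # approval
```

Note that `json.loads` will raise on a malformed trailer (e.g. a `)` where the closing `]` belongs), so a validation step like this catches truncated or mangled model output before it reaches the coding table.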