Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- @dansin468 Wow, you really have that low of an EQ. Don't worry I won't pounce o… (ytr_UgxBpnaLd…)
- To your point, all AI is is a program on a computer. Some people think that it'… (ytr_UgyuhGnnw…)
- Narayanan is like a religious leader... "But you need humans for..." Consistency… (ytc_UgzY8R4WW…)
- My latest AI downloaded from Googleplay is astoundingly good. The one thing I've… (ytc_UgyM9uGrw…)
- Telling that this approach means he believes he has nothing to learn from meetin… (rdc_oh29icz)
- I was at fan expo this year and was so disappointed when I saw all the ai slop. … (ytc_UgyPJNRfd…)
- No no, i totally believe that his plans are correct, he was hoping to be dunked … (ytc_UgxFb-ITZ…)
- Humans know how to take inspiration and not just copy other persons piece, AI do… (ytc_UgwXSMue2…)
Comment
As a software developer, I think I've always had more reasonable expectations of AI from the start. I noticed I was a lot more excited about it when no one heard about it or cared about it and that I've been a lot less optimistic about AI once all of the corporate hype got underway. There is a reasonable factually grounded baseline of expectations. Yes, we have a new technology. Yes, we are now capable of doing things that we were not capable of doing previously. That does not mean the fabric of reality is about to unravel.
youtube
AI Responsibility
2025-10-06T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwSau22Dkd0-fYZDsV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy3U4sFq-CpvKQ4Kc94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmU7PDi3bfRv5wlDR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy53aztPha-gn5hbI14AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugx0mskdlljKknO1Sl14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyzbMJLkU3CbkU8pSh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzE9SS8Hbxn6SoTd6F4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwKv6ZoXSr3zGAxaYt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxjd2jpRCx4Q2NBcsh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzCLlisXoAp1sEjULF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
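The raw response above is a JSON array, one record per comment, keyed by comment ID. A minimal sketch of how such an output could be parsed and looked up by ID (the field names come from the response shown above; the helper name `index_by_id` and the truncated `raw` sample are illustrative, not the tool's actual code):

```python
import json

# A shortened copy of the raw LLM response shown above (two records kept).
raw = '''[
  {"id":"ytc_UgwSau22Dkd0-fYZDsV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzE9SS8Hbxn6SoTd6F4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index each coded record by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(response_text)}

codes = index_by_id(raw)
record = codes["ytc_UgzE9SS8Hbxn6SoTd6F4AaABAg"]
print(record["responsibility"], record["emotion"])  # company resignation
```

The dictionary-by-ID shape is what makes the "Look up by comment ID" view cheap: each coded comment resolves in a single key lookup rather than a scan of the array.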