Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugzy-F5Ur… : "The narrative of a few people are controlling the direction isnt as true as they…"
- ytr_UgyEVkmgw… : "Agreed. Using AI as a "reason" to eliminate people, mitigates lawsuits (whether …"
- ytc_UgyQrewUO… : "Given the state of American medicine, diagnostics is NOT their strong suit. So C…"
- ytc_UgyXLIEX1… : "But why should I bother reading the blog of that company then. I could just ask…"
- ytc_UgwRYWuBD… : "The first part is not exaggerated. My colleague had to mark an essay which had …"
- ytc_UgxArn3BU… : "maybe ai can eventually learn to detect prostate cancer or anything inside my va…"
- ytr_Ugwu04J01… : "Thank you for your comment! If you have any questions or thoughts related to the…"
- rdc_moommqq : "I used to think this but have you tried ai in the past year? It's gotten really …"
Comment
lol people are freaking out over AI for no reason... this is one of those technologies that levels off quickly... kinda like the airplane did. If you'd been around in the 30s, and 40s, when airplanes were really starting to catch their stride, you would think that by now we would be rocketing around the earth in minutes. didn't happen. technology got to a certain point and leveled off... everything points toward AI doing the same. we're not going to get AGI, at least not with LLMs. yall freaking out are gunna look back in 20 years and laugh.
youtube
AI Responsibility
2025-07-28T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
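The coding-result table is a direct rendering of one coded row. A minimal sketch of that rendering, assuming a row is a flat dict with the four dimension keys seen in the raw LLM responses (the example row and the separate `coded_at` argument are illustrative, not the tool's actual data model):

```python
# Sketch: render one coding row as the markdown table shown above.
# The row shape mirrors entries in the raw LLM response; `coded_at`
# is passed separately here as an assumption about where it lives.
def coding_table(row: dict, coded_at: str) -> str:
    lines = ["| Dimension | Value |", "|---|---|"]
    for label, key in [("Responsibility", "responsibility"),
                       ("Reasoning", "reasoning"),
                       ("Policy", "policy"),
                       ("Emotion", "emotion")]:
        lines.append(f"| {label} | {row[key]} |")
    lines.append(f"| Coded at | {coded_at} |")
    return "\n".join(lines)

# Hypothetical example row with the same values as the table above.
row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "none", "emotion": "indifference"}
print(coding_table(row, "2026-04-27T06:26:44.938723"))
```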
Raw LLM Response
```json
[
  {"id":"ytc_UgxB-9kHQwbb7fHG2tx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxUxNG7cTPQbe9QnS14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzVAOcuRgaFEGIiD3R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx7qXS-QdBP6CbnljR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugwdqz4enlNC3-_l9NV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"unclear"},
  {"id":"ytc_UgwcDNTd8ARakHq2EqN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyl796yAI3y8VaX1N54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwyMCHd1mFe_BB4YCl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxhpL5Qu7cHfLBURaJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxyrii6Rx67PJQ8dG14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
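Looking a comment up by ID, as the page header describes, amounts to parsing the raw JSON array and indexing it by the `id` field. A minimal sketch, assuming the raw response is always a JSON array of flat objects like the one above (the `RAW` string here is a two-entry excerpt for illustration; error handling for malformed model output is omitted):

```python
import json

# Illustrative excerpt of a raw LLM response (same shape as above).
RAW = '''[
  {"id": "ytc_UgzVAOcuRgaFEGIiD3R4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxhpL5Qu7cHfLBURaJ4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]'''

def index_codings(raw: str) -> dict:
    """Parse a raw response and index each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(RAW)
print(codings["ytc_UgzVAOcuRgaFEGIiD3R4AaABAg"]["emotion"])  # indifference
```

A real pipeline would also validate that every dimension value falls in the expected label set before accepting the batch.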