Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Nope. AI isn't suddenly something new. Algorithms are fragmented AI. Back in …" — `ytr_UgwPyaocT…`
- "I use ai as reference and they look too similar have the same issues 😂😂😂…" — `ytr_UgyKcL0L7…`
- "$500 a month is not an income it won't provide enough to cover anything other th…" — `ytc_Ugzze9zsW…`
- "@hilburn- human artist can pay compensation for the references they use or just …" — `ytr_Ugwd710Ae…`
- "meme in AI "art" community: "nice book, too bad it was plagiarised from the dict…" — `ytc_Ugykw0rPO…`
- "This guy may have good knowledge of AI, but his knowledge of the potential of ro…" — `ytc_Ugw2O7CFC…`
- "Listen, AI needs to release the Epstein files cause the government obviously isn…" — `ytc_Ugy7kIenX…`
- "I think AI still has more to offer once they fix everything, but we are definetl…" — `ytr_UgzcZGN2c…`
Comment

> @zelle8651 if you want to give up, you can. If you want to fight, it will delay the inevitable. Whether AI takes over in 2 years, 10 years, or 100 years is still up in the air.

Platform: youtube · Topic: AI Moral Status · Posted: 2025-06-27T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgxetQysDQj9_Q2xckF4AaABAg.AIxPYFx4QiYAJp3ryBuMn_","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugw4nzYNIQEQ5juTJBx4AaABAg.AIxOmVudEl4AIxaYDix11J","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugx3iYiKFHOGB5EjQ_J4AaABAg.AIxN5TEdFlJAIxcC2e9iIM","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgzSTGvcRHkkbjtzo6d4AaABAg.AIxMvz7yH0oAIxNOcl55dn","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugxq7-j9ziOIIivmij14AaABAg.AIxMW61t5GJAJZWcxYU07I","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgzWujMYveyHzHSWLbZ4AaABAg.AIxLTWqGG5hAJrulJeb95H","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugz_jQzq2Hsc4FyqE3l4AaABAg.AIxLPr4JaLIAIxOM1AWydp","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_Ugz_jQzq2Hsc4FyqE3l4AaABAg.AIxLPr4JaLIAIxRPKpM2MQ","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgxEIlC-DNKKPe6pwd94AaABAg.AIxLKKl28qPAIyzJmsOx_h","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugz5Cmz2kdC5ml_KXS14AaABAg.AIxL8X-jkPdAIxM_wAy4Wn","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
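The raw response above is a JSON array with one object per coded comment, each carrying an `id` plus the four dimensions from the result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be parsed and indexed for the "look up by comment ID" view — the function and variable names here are illustrative, not taken from the tool:

```python
import json

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response into an index keyed by comment ID.

    Entries missing the "id" field or any coding dimension are skipped
    rather than raising, since model output is not guaranteed well-formed.
    """
    index: dict[str, dict[str, str]] = {}
    for entry in json.loads(raw):
        if not isinstance(entry, dict) or "id" not in entry:
            continue
        if all(dim in entry for dim in DIMENSIONS):
            index[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return index

# One entry copied from the raw response above.
raw = '''[
  {"id": "ytr_UgzWujMYveyHzHSWLbZ4AaABAg.AIxLTWqGG5hAJrulJeb95H",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]'''

codes = parse_batch(raw)
print(codes["ytr_UgzWujMYveyHzHSWLbZ4AaABAg.AIxLTWqGG5hAJrulJeb95H"]["emotion"])
# prints "resignation"
```

Skipping malformed entries (instead of failing the whole batch) means one bad object from the model does not invalidate the other nine codes in the response.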