Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "I'm getting downvotted for explaining what imballance is but yeah i understand w…" (rdc_lux2f8h)
- "Reminds me a little of that macaque selfie copyright question: can a non-human e…" (ytc_UgxeqgX8f…)
- "@lelouchlamperouge5910 why wouldn't we see a boom in GDP of it was so transform…" (ytr_Ugwleh07o…)
- "As an upholsterer I just can't see a robot trying to hand cover a sitting room s…" (ytc_UgxndH90h…)
- "without a human touch art is soulless look at all the AI art it doesnt have that…" (ytc_UgxXYo_LT…)
- "@AlexSendokai2026 ok I know that but that’s still using someone’s art w/o permi…" (ytr_UgwyiEZzG…)
- "We ALL know this could happen, but people still made robots because well the con…" (ytc_Ugynn3inH…)
- "I'm very bearish about the state of the tech industry for individual devs but th…" (rdc_n4cseft)
Comment
People are already getting exponentially lazier and dumber. If AI starts doing everything for us, we can just sit in our easy chair all day and drool while watching TV. Heck...all of our muscles will atrophy and we will be lucky if we can even walk any more. LOL! That doesn't sound like a good life to me. Anyway, AI robots will most likely come to the conclusion that humans have no actual purpose on the Earth, and it will take us out.
youtube · AI Moral Status · 2025-11-15T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
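The Coding Result table above can be rendered directly from one coded record. A minimal sketch, assuming the record is a flat dict of dimension labels matching the raw LLM output format in this dump (the function name and `coded_at` parameter are illustrative, not part of the tool):

```python
# Render one coded record as the markdown "Coding Result" table shown above.
# Assumption: the record shape is a flat dict like the raw LLM output,
# e.g. {"id": ..., "responsibility": ..., "reasoning": ..., "policy": ..., "emotion": ...}.
def coding_result_table(record: dict, coded_at: str) -> str:
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)
```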
Raw LLM Response
```json
[
  {"id":"ytc_Ugw3rw6_lLhOwXK4B9p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxuAskHLSP92pCFrS54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyyeKd58GGPmClx3bR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzFj_JfsDPLl4pi2Wx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxOnxJYAgZ835BW1w14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy-NDgtC4ptswW6FDF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwZqMtdvVoClufwnsh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyymsPfdyYj6LuxL6x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugx_VDbNZWpD1h8Ypf94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgySs4RPmQrUs-nyGHt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
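A raw batch response like the one above can be parsed and sanity-checked before the records enter the coding database. A minimal sketch; the allowed values per dimension are inferred only from the samples in this dump, so the real codebook may include labels not listed here:

```python
import json

# Allowed values per coding dimension, inferred from the sample outputs
# in this dump (assumption: the actual codebook may define more labels).
SCHEMA = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability", "industry_self"},
    "emotion": {"resignation", "indifference", "approval", "fear", "mixed", "outrage"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM batch response and validate every record."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coding records")
    for rec in records:
        # Every record must carry the comment id plus all four dimensions.
        missing = {"id", *SCHEMA} - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing fields {sorted(missing)}")
        for dim, allowed in SCHEMA.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec[dim]!r}")
    return records
```

Validating here (rather than at query time) means a malformed or hallucinated label from the model fails loudly at ingest, before it can silently skew the coded dataset.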