Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I do lots of art stuff for my friends however they are shit at describing what t…
ytc_UgznMN_AF…
Why did Elon even show up? Not one question here was new, not one answer that ha…
ytc_UgzUbsqdQ…
Is anyone going to mention her striking resemblance to Sarah Connor from Termina…
ytc_UgyNmAPjw…
Don't worry guys there are other opportunities opening in the job market. Jobs t…
ytc_Ugw6NkyPB…
chatgpt is helpful, but if you want your assignments to sound more natural and h…
ytc_UgwNo9WZT…
I also hate it, from all these stupid assistants of sites.. all I need is the go…
ytc_Ugw2d53py…
Artificial intelligence will never be able to compete with geniune stupidity. Pl…
ytc_UgwkRF1P-…
It's not the ai art itself it's people who use it that consider it as real art m…
ytr_Ugz3ygRHA…
Comment
A robot that doesn't know it's "bad" to be broken will end up broken and useless faster than a robot that weighs the risk of a situation against the likelihood or necessity of completing a task. And I guess from that might emerge an imperative to continue functioning; a will to live.
I mean really that's why we have pain; we could've evolved without pain receptors, but we wouldn't know when we'd been hurt or injured, and we'd die. So really, is there a difference between our nerves and a robot's sensors, if the end goal is to avoid damage and continue functioning?
youtube
AI Moral Status
2017-02-23T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgiJxrcUrvuH_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiTPqoAdEgMr3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggC_jx4u5W3BXgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UghhWiVkOMPmungCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UghBsb6B-kdY_XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg9i1U5KLMObngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghODAUsQRPifngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggq231mY4_ztHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjF3w78FAbALXgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiiPSD_XyGyKXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}]
```
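The raw response above is a JSON array of per-comment codings, each carrying the four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion) plus a comment ID. A minimal sketch of how such a payload could be parsed, validated, and indexed for ID lookup — `parse_codings` and the sample payload below are illustrative, not the tool's actual implementation, and the IDs are placeholders:

```python
import json

# Two sample records in the same shape as the raw LLM response above.
# The "ytc_example_*" IDs are illustrative placeholders, not real comment IDs.
RAW_RESPONSE = """[
  {"id": "ytc_example_1", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_example_2", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]"""

# The four coding dimensions every record is expected to carry.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict:
    """Parse a raw coding response and index the records by comment ID."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        missing = DIMENSIONS - rec.keys()
        if missing:
            # Reject records that omit a coding dimension.
            raise ValueError(f"record {rec.get('id')!r} missing {missing}")
        index[rec["id"]] = {k: rec[k] for k in DIMENSIONS}
    return index

codings = parse_codings(RAW_RESPONSE)
print(codings["ytc_example_1"]["emotion"])  # -> fear
```

Indexing by ID mirrors the "Look up by comment ID" affordance above: once the array is turned into a dict keyed on `id`, retrieving any single comment's coding is a constant-time lookup.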