Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
I’m a dev i use ai all the time. It’s very helpful. No it can’t replace me becau…
ytc_Ugz6mywrf…
Except we can emulate empathy with emotional intelligence AI. Most people prefer…
ytr_Ugzr0qOcL…
So these people who do not even understand how a lightswitch work, are now exper…
ytc_UgwbYugvv…
there will those that are light/truth... there are those that are dark/lies... t…
ytc_Ugy2bn_lZ…
It never will
Regardless of whatever others may say
Art isn’t just a pretty p…
ytr_Ugx6Fn6bb…
That's a thing americans saying you shouldn't give a robot a gun now thats hilar…
ytc_UgyzKtk2h…
It is actually designed to get a read on you and reflect back your beliefs enoug…
ytr_UgwSX6THL…
Actually no work no money no buying ai products... explain that... Ford Founder …
ytc_UgwjMbS6q…
Comment
I'm sad that this video did not address the relationship between "pleasure and pain" and the current state of machine learning "success and failure conditions" It can be argued that they are the same thing. And this escapes much of the moral implications. The solution is to provide appropriate success and failure mechanisms such that it has either the ability to do well at its task, or to get better with experience. When we get into generalized AI, the question becomes about whether or not a task is appropriate for that AI. To become what they did, were we forced to give it success & failure conditions that would consider a life of, say, pattern recognition to spot camouflaged military units as continuing to match those conditions; or is it able to adapt without pain, to having that be a new source of satisfaction.
But this is months old and no one will read this.
youtube
AI Moral Status
2017-07-19T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugi_XeI1C9dhtXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh96lbc5i0kzngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghAJgk8mN00NngCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjwEFRszNf0QHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgjyM7IW-z036XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg50stavv1EhXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiFsTnbAJ1XYXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgiPcEJpPeNspHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Uggi6_Zio_diSHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh89vWR8MHAungCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
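A raw response like the one above can be parsed and sanity-checked before the per-comment coding result is stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the values visible in this dump and are assumed, not confirmed, to be the complete codebooks, and the name `validate_codings` is invented for illustration.

```python
import json

# Assumed codebooks, inferred from the values that appear in this dump.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "fear", "resignation", "approval", "outrage"},
}

def validate_codings(raw: str) -> list:
    """Parse a raw LLM response and reject rows with unknown dimension values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# One row copied from the response above.
raw = ('[{"id":"ytc_UghAJgk8mN00NngCoAEC","responsibility":"developer",'
       '"reasoning":"mixed","policy":"none","emotion":"resignation"}]')
rows = validate_codings(raw)
print(rows[0]["emotion"])  # resignation
```

Rows that fail validation can then be queued for re-coding rather than silently written into the results table.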