Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I see you found the dialogue amusing! 😄 If you enjoy interactions with AI like S…" (ytr_UgyEVY0cN…)
- "Conversely, I need my Ai to point out my mistakes, not aiding in my confirmation…" (ytc_UgyGvQulg…)
- "it is not nearly as valid lol youre just upset he isnt defending your laziness. …" (ytr_UgxtzBmUU…)
- "Some of these guys are either delusional or ill intended if they don’t “see” the…" (ytc_UgyhgkvmC…)
- "We at technolgies future which belong to Gary Yearwood are re building a auto ma…" (ytc_UgzMd-pRI…)
- "Ever play Metal Gear Solid 2 Sons of Liberty? reminds me of the GW AI.…" (ytc_UgyfJPLdF…)
- "So now we can't even talk on the phone with people. Why does AI have to be forci…" (ytc_UgyZd4CIB…)
- "Who is still believing his BS ?! His universal high income is only for the rich!…" (ytc_UgwcCnV-f…)
Comment
As far as I'm concerned, the only way to have consciousness is to have feelings (all creatures do what they do because of how it makes them feel, it's just some do what they do because they can predict how they'll feel in the future, so will put up with pain in the short term for gains in the long term). Furthermore, as far as I'm concerned this is the best way to make machines which are capable of adapting to the real world.
Essentially, I think feelings/emotions are a requirement for sufficiently advanced AI. I think any AI which has feelings should be given rights. I also think we should consider that when building such an AI, as it's theoretically possible to tailor their feelings to the environment they've been designed for (just as evolution does for organic life).
Source: youtube | Video: AI Moral Status | Posted: 2017-02-23T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgjpHbD1cb_bGHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgiwpEgnkVIjz3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugj3v0gqenbpS3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgiMAV2WUQbo3HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgjPO1aWk3kRM3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgiaWn-BMIFxdHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugjesjn2d2Is3XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjGnJ_vguQsu3gCoAEC","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"unclear"},
{"id":"ytc_Uggz-8DSC64i2XgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UghST1ICt0Ozk3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}]
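The raw response above is a JSON array with one coding record per comment id. A minimal sketch of how such a response might be parsed and validated downstream; note that the value vocabularies below are inferred from this single sample (not an official codebook), and `parse_codings` is a hypothetical helper, not part of the tool:

```python
import json

# Observed category vocabularies, inferred from the sample response above.
ALLOWED = {
    "responsibility": {"none", "developer", "government", "unclear"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "none", "liability"},
    "emotion": {"indifference", "approval", "mixed", "unclear", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse the JSON array and index codings by comment id.

    Any missing dimension, or any value outside the observed
    vocabulary, falls back to "unclear".
    """
    out = {}
    for rec in json.loads(raw):
        coded = {}
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim, "unclear")
            coded[dim] = value if value in allowed else "unclear"
        out[rec["id"]] = coded
    return out

raw = ('[{"id":"ytc_abc","responsibility":"developer","reasoning":"mixed",'
       '"policy":"regulate","emotion":"approval"}]')
print(parse_codings(raw)["ytc_abc"]["policy"])  # regulate
```

Indexing by id makes it straightforward to join the parsed codings back to the comment table shown above, and the "unclear" fallback mirrors how unparseable dimensions appear in the Coding Result view.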