Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
not really, unless you make a robot that can feel pain and/ or have a high self-preservation protocol.
a self-aware robot that doesn't have these would still be a self-aware robot, but the risk of it refusing to follow orders and/ or rewriting its own code would be minimal. there's no purpose in making a machine feel pain. we need pain to keep us away from harming ourselves, but a few lines of code can do this for a robot...
youtube
AI Moral Status
2017-02-23T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugi3l4d6_ZVSPngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UggXxUS6ImDcVngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UggiiBkWN73X1ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugg5MrhnXcA4ZHgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UggRPiq5dwY9P3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugg2foK25E_ACHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjaRkRwKWzpoHgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugg8QyIAn2PW43gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj4vFwy4jRFsngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgjrvLoOOdSbhngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
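The lookup from comment ID to coding result can be sketched in a few lines of Python. The four dimension names come directly from the JSON response above; the `lookup` helper itself is illustrative, not part of the pipeline.

```python
import json

# Two rows copied from the raw LLM response above; the real response is a
# JSON array with one object per coded comment, keyed by "id".
raw_response = """
[
  {"id":"ytc_Ugg2foK25E_ACHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj4vFwy4jRFsngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw: str, comment_id: str) -> dict:
    """Parse the raw model output and return the coding for one comment ID."""
    rows = json.loads(raw)
    by_id = {row["id"]: row for row in rows}
    row = by_id[comment_id]
    # Keep only the coding dimensions, dropping the ID itself.
    return {dim: row[dim] for dim in DIMENSIONS}

coding = lookup(raw_response, "ytc_Ugg2foK25E_ACHgCoAEC")
```

Running this against the row for `ytc_Ugg2foK25E_ACHgCoAEC` reproduces the Coding Result table above: responsibility `developer`, reasoning `consequentialist`, policy `none`, emotion `indifference`.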