Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
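The lookup described above can be sketched by indexing the parsed response on the `id` field. This is a minimal illustration, not the tool's actual implementation; the record shown is hypothetical but follows the shape of the raw responses displayed further down.

```python
import json

# Hypothetical raw LLM response, same shape as the arrays shown below.
raw = '''[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Build an index from comment ID to its coded record,
# so any comment can be looked up in O(1).
index = {row["id"]: row for row in json.loads(raw)}

record = index["ytc_example1"]
print(record["emotion"])  # fear
```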
Random samples
Robot:*gets stressed out*
Human:*peacefully walking by*
Robot:*breaks the fourth…
ytc_Ugx1uZJsI…
Don't waste your time with ai. These machines constantly make up nonsense from t…
ytc_UgwgbCnAw…
Two things are certain in life, technology will advance ... and someones gonna f…
ytc_Ugwd1Wcxa…
Lol as an avid Stable Diffusion user, I get none of this silliness. LLMs are a d…
ytc_Ugxibr1yd…
Disabled fiber artist here. One thing worth mentioning I think is that art and c…
ytc_Ugw2aCCZK…
It's not even AI from what I can read. Daz 3D is not an AI tool, it's a 3D tool.…
rdc_lu7h4ha
I love your little anti-clanker punk painter girl here. I support her in all of …
ytc_UgxDKwY8j…
AI should be used to do what humans are not [yet] capable of, not replace us. Fo…
ytc_UgxSSxDxx…
Comment
Some things like this however prevent other problems, for example if a robot doesn't fear death then what is to stop it from killing it's self, if a robot can't feel pain then what is to stop it damaging it's self beyond repair.
Questions like this make you wonder if it's even humane to create conscious robots in the first place.
youtube
AI Moral Status
2017-02-23T19:2…
♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UggAeEyGOWLwAngCoAEC.8PKrFrKHVKP8PL23LM5COY","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UghOZvHHUFgJ13gCoAEC.8PKpyAJodvg8PKv-x5RGGb","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UghOZvHHUFgJ13gCoAEC.8PKpyAJodvg8PKwpTqLs6F","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytr_UgilbiiByK6t2XgCoAEC.8PKopNCybwI8PKsE8YAn-m","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_UgjSV_17zxqnKXgCoAEC.8PKo7A43RufAAt3SmiS0oL","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugg3Gq3iNadmtngCoAEC.8PKnw7QktlE8PKyLS3rwGm","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugjj8IHpZfnxuXgCoAEC.8PKnTtN5daI8PKxLoX1UPh","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytr_Ugj55TnUfAY1t3gCoAEC.8PKnOLstUyW8PKuyC3D287","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugj55TnUfAY1t3gCoAEC.8PKnOLstUyW8PKvOoweAW3","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UggIk_cdpwuOu3gCoAEC.8PKnLp4C2128PKvHvqzlOi","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
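Before trusting a raw response like the one above, each row can be checked against the coding scheme. The sketch below validates rows against the label sets visible in this output (the actual codebook may define additional categories, so `ALLOWED` is an assumption inferred from the data shown); rows missing an `id` or using an unknown label are dropped.

```python
import json

# Allowed labels per dimension, inferred from the values visible in the
# raw responses above; the real codebook may permit more categories.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "none"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "none"},
}

def validate(raw_json: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coded rows."""
    good = []
    for row in json.loads(raw_json):
        if "id" not in row:
            continue  # a row without an ID cannot be linked to a comment
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            good.append(row)
    return good

# Hypothetical input: the second row uses an unknown responsibility label.
sample = ('[{"id":"x1","responsibility":"developer","reasoning":"consequentialist",'
          '"policy":"regulate","emotion":"fear"},'
          '{"id":"x2","responsibility":"robot","reasoning":"mixed",'
          '"policy":"none","emotion":"mixed"}]')
print(len(validate(sample)))  # 1
```

Invalid rows are dropped rather than repaired here; a production pipeline might instead flag them for a re-coding pass.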