Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
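The lookup described here could be sketched as a simple in-memory index keyed by comment ID. This is only an illustration, assuming a dict-backed store; the tool's actual storage backend is not shown on this page. The ID and dimension values are copied from the coding result below.

```python
# Minimal sketch of "look up by comment ID", assuming coded comments are
# held in a dict keyed by ID (the real backend is unknown; this is illustrative).
coded_comments = {
    "ytr_UggLATWm7zy_1HgCoAEC.8RNh-2LC0dq8SYmwOU1pqF": {
        "responsibility": "none",
        "reasoning": "consequentialist",
        "policy": "none",
        "emotion": "indifference",
    },
}

def lookup(comment_id: str):
    """Return the coded dimensions for a comment ID, or None if unknown."""
    return coded_comments.get(comment_id)
```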
Random samples
- rdc_f9ec78x: Yup! Just commented on how confused I am about this. When I was in Havana I reme…
- ytc_UgzcTUQA7…: Most of the customers in my cs job don’t really know what they want. I think an …
- ytc_UgzNJyODg…: The main problem with this isn't AI, it's art itself. Art and who owns it is a c…
- ytc_Ugx10Sywt…: Why do people create robot and AI? because we need labour to do intensive and da…
- ytc_UgwUcRQFu…: LOL if ai is gonna unemploy everyone why is unemployment so low? nothingburger. …
- ytc_UgzrDLlWy…: I love the ones comparing AI generation to being in a wheelchair, in this analog…
- rdc_n3l6yax: very true. i am a software developer and i heavily rely on these ai tools to wor…
- ytc_Ugwu2PuzX…: I'd say yes, if an AI can basically think like a person then it becomes a person…
Comment
Pythia Brixham Well, the first step towards sentience (imo) would be the development of basic self-preservation protocols, which would be pretty useful in a robot. I mean, you wouldn't want an expensive machine to just let itself be destroyed if it could easily save itself, right?
To properly do that, you would need to program it to recognize when it is in danger or malfunctioning and take steps to fix the problem. At that point, they would have something similar to a sense of pain. It's kind of ambiguous where it might progress from there, but those first steps are pretty reasonable from a design standpoint.
| Field | Value |
|---|---|
| Platform | youtube |
| Title | AI Moral Status |
| Posted | 2017-05-14T18:1… |
| Likes | 25 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UggLATWm7zy_1HgCoAEC.8RNh-2LC0dq8SYmwOU1pqF","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UggLATWm7zy_1HgCoAEC.8RNh-2LC0dq8SvnX3_NM1H","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UggLATWm7zy_1HgCoAEC.8RNh-2LC0dq8TZaylAKGpF","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgjeKkhTiv7Hz3gCoAEC.8RMHCXv3sjC8SxTzSCYhno","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwCnJtFPeuC9fR76Z94AaABAg.8QfMAafCrSj8QfXbNeFMQT","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytr_UgyNyx8NSktHm15PmgN4AaABAg.8QbkmPQfTQe8S9pXzDQEM8","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxAkrhfdggp5M7Mml14AaABAg.8QaaGzrgkbL8QkkFcgAScE","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgxAkrhfdggp5M7Mml14AaABAg.8QaaGzrgkbL8QqWE8BZVtF","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxAkrhfdggp5M7Mml14AaABAg.8QaaGzrgkbL8QravBfDonJ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgytjKg_utyNcZ2NWTt4AaABAg.8QUuG_AclmP8RK344ChpwN","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
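The raw response above is a JSON array of per-comment codes. A minimal sketch of how such a response could be parsed and sanity-checked in Python follows; note the allowed values below are only those observed in the responses on this page, not necessarily the full codebook, and `parse_codes` is a hypothetical helper, not part of the tool.

```python
import json

# Dimension vocabularies observed in the responses on this page;
# the full codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"indifference", "approval", "fear", "mixed", "outrage"},
}

def parse_codes(raw: str):
    """Parse a raw LLM coding response, keeping only records whose
    dimension values fall within the observed vocabularies."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]
```

Filtering rather than raising keeps one malformed record from discarding the whole batch; a production coder would likely also log the rejects for re-coding.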