Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Initial Q&A was fun. I am hopeful AI will generate ( for a fee ) term papers …" — ytc_UgzrOMDH0…
- "This topic is so relevant! I’ve been diving into AICarma and it’s helping me opt…" — ytc_UgwQwuXX7…
- "I don’t call ai a “tool” anymore when it’s stealing jobs. That’s not a tool that…" — ytc_Ugyrl8Fjo…
- "current third year, mostly interested in rads. I am of the mindset that AI in it…" — ytc_UgyJHqTZX…
- "AI coders produce bugs for sure. I produce 5x more bugs and take 100x more time …" — ytc_Ugzlkng19…
- "Self-driving cars are supposed to have a cyan-coloured headlight and taillight t…" — ytc_UgzkK-yWR…
- "I get why you might feel that way! The idea of AI and robots discussing wisdom c…" — ytr_Ugxl4jIkM…
- "ai is used to cheat on homeworks because kids are not interested in doing homewo…" — ytc_UgygHTJ-0…
Comment
No and here's why. Artificial intelligence will never truly reach sentience, but it is capable of achieving a semi-limited consciousness. For true sentience, AI will require an almost limitless power source to maintain it, which could be possible, but at the same time, using machine learning will only limit itself to everything it has been fed up to that point. Upon learning new knowledge it will be unable to categorize it unless specifically coded with that ability which is impossible since in order for something new to be unrecognizable it must not have been coded for and would therefore result in error. HENCE, the answer is no. While it can learn from its surroundings, AI is simply innovative, and not inventive due to its limited parameters and dependency on perior knowledge to obtain new knowledge.
Source: youtube · Video: AI Moral Status · Posted: 2023-11-01T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
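Each coded comment carries one value per dimension. A minimal sketch of how such a coding could be validated — using only the value sets that actually appear in this sample (the full codebook may allow more values; the `validate` helper is an assumption, not part of the pipeline):

```python
# Value sets observed in this sample; the real codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "media", "user"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"indifference", "fear", "mixed", "approval", "outrage"},
}

def validate(coding: dict) -> list:
    """Return the names of dimensions whose value falls outside the allowed set."""
    return [dim for dim, value in coding.items()
            if dim in ALLOWED and value not in ALLOWED[dim]]

# The coding shown in the table above passes; a bogus value is flagged.
print(validate({"responsibility": "none", "reasoning": "consequentialist",
                "policy": "unclear", "emotion": "indifference"}))  # []
print(validate({"responsibility": "aliens"}))  # ['responsibility']
```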
Raw LLM Response
```json
[
  {"id":"ytc_UgxPWDP0Is_wYltMk8J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzNenuyGGwouuD40TR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwMQxii1QJfLjFn28N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxazTWjltDpPlXh4tB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzt-c93e-OEKRip4SN4AaABAg","responsibility":"media","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxFVLkzuedZ42hvsdZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwsl3Oq0mXWOh_F8tV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxKfstgH9RbJRPvPJd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw9AdwpyhxmFWX6TS94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwIbECxZnMOBWc_DWF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
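A raw response like the one above can be parsed and keyed by comment ID for lookup. A minimal sketch, assuming the model returns a well-formed JSON array with this exact field layout (the `index_by_id` helper and the two-row sample are illustrative, not the pipeline's actual code):

```python
import json

# Two rows copied from the raw response above, abbreviated for the example.
RAW_RESPONSE = """[
  {"id": "ytc_UgxPWDP0Is_wYltMk8J4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw9AdwpyhxmFWX6TS94AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and key each coding by its comment ID."""
    rows = json.loads(raw)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codings = index_by_id(RAW_RESPONSE)
print(codings["ytc_Ugw9AdwpyhxmFWX6TS94AaABAg"]["policy"])  # regulate
```

Indexing once and looking up by ID is what the "Look up by comment ID" control above amounts to; `json.loads` will raise `json.JSONDecodeError` if the model ever returns malformed output, which is worth catching in a real pipeline.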