Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
What AI still lacks today: the feeling of passing time. How could one test whether an AI notices that time is passing? That it _knows_ and notices that yesterday it might have assessed a situation differently than today, with more additional information? Can an AI have a sense of the order in which it learns 'things' and organizes them in its neural network? A, in my opinion, important ability that we humans have.
youtube · AI Moral Status · 2026-03-11T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgyGD6r5zlHaT1gotqt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw3EIlhfd4B-WFp_0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyC40I34mHuPZHUMSB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwFxGHYOB6--yFoieF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwjMA6Q62PWQEBjA2t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw7_FYmb5eA5mZcmOh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugz9dIM-CUqIqS7pjxF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzPkWcNrbi0vvfIFPl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz4FggVzqZy5j8_iq94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzQ_9xIi8e5efDfds94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}]
```
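A raw response like the one above can be parsed into a per-comment lookup keyed by comment ID. The sketch below is a minimal illustration, assuming only the record shape shown here (an `id` plus the four coding dimensions from the table); the `parse_codes` helper name and the key-validation step are not part of the pipeline, just illustrative.

```python
import json

# Illustrative sample in the same shape as the raw LLM response above
# (one real record copied from it, abbreviated to a single element).
raw = """[
  {"id": "ytc_UgyGD6r5zlHaT1gotqt4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"}
]"""

# The four coding dimensions plus the comment ID, matching the
# Coding Result table shown above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw_json: str) -> dict:
    """Map each comment ID to its coded dimensions, skipping any
    record that is missing an expected key."""
    codes = {}
    for record in json.loads(raw_json):
        if EXPECTED_KEYS.issubset(record):  # issubset checks dict keys
            codes[record["id"]] = {k: record[k] for k in EXPECTED_KEYS - {"id"}}
    return codes

codes = parse_codes(raw)
print(codes["ytc_UgyGD6r5zlHaT1gotqt4AaABAg"]["emotion"])  # indifference
```

Keying the lookup by comment ID is what makes the "look up by comment ID" inspection possible: any coded comment can be retrieved directly from the parsed batch.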