# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Look up by comment ID

Random samples:
- "0:13 Humans also dont make art out of thin air Usually you need a foundation , …" (ytc_UgxcAZCF0…)
- "This is trash and paid for. You don’t know how consciousness work or have you so…" (ytc_UgxHyif2S…)
- "WHAT IS REALLY NEEDED IS TO FORGET AND SHUT DOWN ALL STUPID ASS A.I. BOTS AND SU…" (ytc_UgwWCfaR5…)
- "I'm perfectly fine with people using nightshade etc myself, as the truth is... y…" (ytc_UgzT7shuP…)
- "AI art is for people who hate the concept of art but really just want to rub it …" (ytc_UgzGOYIZD…)
- "Gotta lol when back in the day, everybody said AI will replace many tasks, inclu…" (ytc_Ugxr9XzvP…)
- "idk if it's the pfp you're using, but frankly I like it. nothing needs to be per…" (ytr_UgwPxnvbR…)
- "I didn't know ai picture generator supporters have been whiney about their unwil…" (ytc_Ugxz-Jgw3…)
## Comment

> There should be a Action Safety Measure since AI only use facial expressions & actions to decide. People that are abusive or psychopaths / mental disordered AI Owners would feed the AI clouds with Negative impacting data that another AI robot would use in a different home. Also humans are becoming less and less human, becoming less patient and becoming morally corrupt on world wide scale this could create a issue due to the fact that the AI is learning from their Owners & Cloud and not from a controlled and domestic program. More thought should be put into this.

youtube · AI Moral Status · 2021-08-23T08:1… · ♥ 1
## Coding Result

| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response

```json
[
{"id":"ytc_UgwrJtXjKCVEXJcAtUh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxxJ6U17OWS8tNjwv54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx8soE6Ykpr40ylGX54AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwYzC0kJM7dLYIoCe94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz7P8rH7-OTJx7-UN14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiTwSt11XYZdoFBRd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgycyuZWYe4cMQmGVUJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx9im6_AXAP8Ggg-094AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyAhy20KSPaoDc6nDx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyMbs81vcsa_q1bzrB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
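The lookup-by-comment-ID step described above amounts to indexing this JSON array by its `id` field. A minimal sketch in Python, using an abridged two-record payload copied verbatim from the response (the exact loading and lookup code in the tool itself may differ):

```python
import json

# Abridged raw LLM response: two records copied from the full array above.
raw = '''
[
  {"id": "ytc_UgyAhy20KSPaoDc6nDx4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwrJtXjKCVEXJcAtUh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "ban", "emotion": "outrage"}
]
'''

# Index the coded records by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

record = by_id["ytc_UgyAhy20KSPaoDc6nDx4AaABAg"]
print(record["policy"])   # regulate
print(record["emotion"])  # fear
```

Each record then maps directly onto the Coding Result table: one row per dimension (responsibility, reasoning, policy, emotion).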