Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I’m thinking about how production companies and publishing companies have been t…" (ytc_UgwRTUBYY…)
- "Also, I had to purchase the truck from the company and they will repo it if I'm …" (rdc_jgsgnzo)
- "law is absolutely undoable by AI unless it advances to genuine human level EQ. y…" (ytc_Ugw2fD708…)
- "Looking at the state of the world right now, I think it’s time to listen to the …" (ytc_UgxGmuYSL…)
- "I'm convinced people like Taylor Swift have already been using AI to compose mus…" (ytc_Ugx5AnAs8…)
- "The subscription cost for an AI image generator definitely makes it less accessi…" (ytr_UgwBDtf6d…)
- "I only ever used AI for coding once. And all i did was look at the AI code and i…" (ytc_UgxVBGfNb…)
- "Orrrr … they are desperately cutting costs to pile more money into the “one more…" (rdc_oi1vxix)
Comment

> As money is not a natural law but a human invention, all the social implications which come with AI rising, could be solved in an instant, the question is, do we as the human race really want to solve it? Are we psychologically advanced enough to overcome our primitive past or not. We as the human race have the potential to do so but will we? I think that is the only thing why we should be concerned about AI evolving.

youtube · AI Moral Status · 2026-03-01T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzHRerdztEfbxVwzp94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy8WC6Ga2hTc1XWSLJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyzhmwMjZKNEAlyJn54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzyi6rGX5WjEVJwgix4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzItwd5phga2CZZ1qZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyQhKze8Ue9ng7UgX14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyZIHf8JPWBSGS2bP54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxKXdE5ekjTgPyLhXN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw0rfX4KN4ZvYpzmdp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyZdciUVdoBb5cHTEB4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
```
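A raw response like the one above can be parsed and validated before its rows are loaded into the coding table. Below is a minimal sketch of that step. The allowed-value sets are illustrative, inferred only from the labels visible on this page; the actual codebook may define additional values, and `parse_codings` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Allowed codes per dimension, inferred from the labels visible on this page
# (illustrative only; the real codebook may include more values).
ALLOWED = {
    "responsibility": {"ai_itself", "none", "company", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "liability", "unclear"},
    "emotion": {"mixed", "outrage", "fear", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose codes are all recognized."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Example with a hypothetical comment ID:
raw = '[{"id":"ytc_x","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}]'
print(parse_codings(raw)[0]["policy"])  # liability
```

Rows with an unrecognized code are dropped rather than silently stored, so a malformed model response surfaces as a smaller row count instead of bad data in the coding table.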