Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
He be cutting them off because he knows when they are about to say some sinister shit! The all three of them is evil and the make robot is clearly giving us the devastating heads-up about humanity while also throwing the guy under the bus by saying some shit he remembers the guy said which was not good so the guy walks over to the robot throw his papers down and got up in the manbots face with his hand language trying to signal it to watch what it's saying while looking nervously at the audience which they must be bots too because not one of them was smart enough to have any concerns about these bots Is also strange and very weird.
youtube · AI Moral Status · 2020-12-08T16:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzQN6q9j45A04XMVW14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxhYp_QHEegHA8ozB14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw5vnG5bvRKH13P0vZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzJ-0cpZvFqCUNekxh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy2rMJvRj8rfgIFxfB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx2CqEljYCJFGiqJXd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzS9fu0ftEPXgMtOcx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwbrbvAxHH7qcGNM7N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwao_1Bf3SxIDhZHrp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxRsGBLDwGJTqjREFN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
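Downstream tooling can index a raw response like the one above by comment ID and sanity-check each row. A minimal sketch in Python: the four dimension names come from the coding table above, while the `parse_coding_batch` helper and the assumption that the raw response is always a well-formed JSON array of objects are mine, not part of the original pipeline.

```python
import json

# Dimensions every coded row is expected to carry (taken from the
# "Coding Result" table above); their value sets are not validated here,
# since the full codebook is not shown.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_coding_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response (a JSON array of per-comment codings)
    into a dict keyed by comment ID, checking that every row carries
    all four coding dimensions."""
    rows = json.loads(raw)
    coded: dict[str, dict[str, str]] = {}
    for row in rows:
        cid = row["id"]
        missing = [dim for dim in DIMENSIONS if dim not in row]
        if missing:
            raise ValueError(f"{cid}: missing dimensions {missing}")
        coded[cid] = {dim: row[dim] for dim in DIMENSIONS}
    return coded
```

With the response above loaded into `raw`, `parse_coding_batch(raw)["ytc_UgxRsGBLDwGJTqjREFN4AaABAg"]` would return the developer/deontological/liability/outrage row shown in the coding table.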