Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI will work along side with the people who would be helping AI to replace their…" (`ytc_Ugw4DR1vS…`)
- "The real threat ai poses is to aspiring and amateur artists. There was a place f…" (`ytc_Ugw1cAqwd…`)
- "Hey Sagar! I am a boomer and I don’t appreciate your stereotyping of me! I can’t…" (`ytc_UgwvknzPb…`)
- "ai takeover is in 2031, with all christians removed from earth by god, and it be…" (`ytc_UgyiBRwQQ…`)
- "Christ on a bike, that's a ridiculous ruling. There's no medical evidence that s…" (`ytc_UgzTMhPSd…`)
- "I got to similar conclusions I find by vectorial similarly (not sure it is still…" (`rdc_ohzprd0`)
- "For me this is daunting on an immesurable scale, I've been drawing since I was a…" (`ytc_Ugya1ajyx…`)
- "it should be noted here that it is pretty much illegal to work on your John Deer…" (`ytc_UgyBWPtiS…`)
Comment
> So we are essentially building our replacement? Is that what you guys think? We are building something that can take over ourselves? Displace us? Then what are we going to do when all the things are taken over by robots? Die? Leave the planet? Wherever we go we’ll do the same it’s ingrained in us this very comment I’m writing is a product of my experiences and is being used to train models. Those models will produce and end up being services used by some other ai. What’s the end goal? If there are no humans to consume those goods to live our lives. You guys are just stupid and don’t see the bigger picture. If we are building such a thing is because we are supposed to. We are evolving
Source: youtube · Posted: 2025-04-24T15:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgykwUlT5zj0Hm9vAp94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxqS7gztEwRC1DBmmp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugy0ryeza2oEFqlMcBJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwNm16tE9PETV1TGJp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxhnfn_asJGkv6O8x54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzjt5tPW2GS0BqOR8Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwFxxJ5n0vaAx8vOK94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxiHerhyi3lFqHrU3R4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugz0HPUe4zx874lqnJl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzxJy5LbFo9tL_oqbZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
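A raw batch response like the one above can be parsed and indexed by comment ID before the per-comment "Coding Result" view is rendered. The sketch below shows one way to do that, assuming the allowed values per dimension are exactly those visible in this dump (the real codebook may define more categories, so `CODEBOOK` here is an assumption, not the tool's actual schema):

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# dump (assumption: the real codebook may contain additional categories).
CODEBOOK = {
    "responsibility": {"company", "distributed", "ai_itself", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"liability", "industry_self", "regulate", "none", "unclear"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed",
                "resignation"},
}


def parse_response(raw):
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the (assumed) codebook, so malformed model output is caught
    before it reaches the dashboard.
    """
    coded = {}
    for record in json.loads(raw):
        comment_id = record["id"]
        for dimension, allowed in CODEBOOK.items():
            value = record.get(dimension)
            if value not in allowed:
                raise ValueError(
                    f"{comment_id}: bad {dimension!r} value {value!r}")
        coded[comment_id] = {d: record[d] for d in CODEBOOK}
    return coded


# Example: the record behind the "Coding Result" table above.
raw = ('[{"id":"ytc_Ugy0ryeza2oEFqlMcBJ4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
coded = parse_response(raw)
print(coded["ytc_Ugy0ryeza2oEFqlMcBJ4AaABAg"]["emotion"])  # fear
```

Indexing by ID is what makes the "Look up by comment ID" feature a constant-time dictionary access rather than a scan over every raw response.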