Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Hottie. I wonder if it's molded and cast from a real person or designed, AI, 3D …" (ytc_UgwYCORle…)
- "Im waiting for the day AI takes over and all these bitches using no filter AI ch…" (ytc_UgzPbMg8D…)
- "There is a market for driverless cars, but, please stop supporting bad transit c…" (ytc_UgxwAml0q…)
- "Its not going to go well. It’s not going well now. The environmental, social, ec…" (ytc_UgwDk_sAN…)
- "Turns out the globalism smart Republicans helped implement was what exactly woul…" (rdc_e2vzduq)
- "I'm a heavy user of AI, and so far when it comes to actual creativity it sucks. …" (ytc_Ugz0OBkwC…)
- "some try to humanize content, but Winston AI still catches most of it. it’s shar…" (ytc_UgzyMf8vM…)
- "At least before we entered a jobless dystopian society Peter thiel has finally c…" (ytc_UgyrxHm3H…)
Comment
Many AI vs humans scenarios have been played out in science fiction novels and movies for decades. Some of it was harmless and entertaining but most of it had consequences that didn't bode well for humanity. Therefore, who shall we blame if the AI take-over actually happens? But by then it'll be too late to object, my little lambs, as you'll already be well on your way to slaughter.
youtube · AI Moral Status · 2021-12-15T23:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzb2OyhQ9FQdX_dlQN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzI_M1FPfJJbx1IjFB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwQGjgxuyvl9BcQEul4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxOmCH-jSfIr53ZgAN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwWVwVkUmoXuhhSFw54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxfLKA9bGYyRt_iBGZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyGFnxOS-ZLLxBtdPt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwO1KZ46YN57BkC_5d4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyUytHI0WTGI-nMVaV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz57-0JyhgG3N6M-Ll4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
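A raw response like the one above can be validated before the codes are accepted. The sketch below is a minimal illustration, not the tool's actual pipeline: it assumes each record must carry an `id` plus the four dimensions shown in the coding table, and it uses only the value sets observed in this batch (the full codebook may define more labels).

```python
import json

# Allowed values per dimension, as observed in this batch (assumption:
# the real codebook may permit additional labels).
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"fear", "approval", "indifference", "resignation", "outrage"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with missing keys
    or values outside the known codebook."""
    records = json.loads(raw)
    for rec in records:
        missing = {"id", *SCHEMA} - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing keys {missing}")
        for dim, allowed in SCHEMA.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec[dim]!r} not in codebook")
    return records

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
print(len(validate_response(raw)))  # 1
```

A record with an off-schema value (e.g. `"emotion": "joy"`) would raise a `ValueError` naming the offending comment ID and dimension, which makes malformed batches easy to spot before they reach the coding table.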