Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples, shown with their comment IDs:
- google ai: so put some glue inside of that gushing wound, like the elmers glue s… (ytc_UgyGPKu1Z…)
- I do NOT want to be anywhere near a driverless truck. People will "end" from th… (ytc_UgxQMmcy4…)
- I feel this :( I used to have a great memory, and I used to be able to learn fas… (ytc_UgyxNBvnR…)
- I am a mobile app developer and since the day i started using ChatGpt , my produ… (ytc_UgyB2yctJ…)
- I like him, just look at the revolutionary beneficial uses of AI. De danger we'r… (ytr_Ugw0hRSAV…)
- Predictive policing is just the cop version of Preemptive war. Our federal gover… (ytc_UgwCQXJIh…)
- doesn't matter. Its only going to get worse. Eventually you will not be able to … (ytr_UgxFSk2KC…)
- It's interesting you mention that! Sophia often conveys a range of emotions, whi… (ytr_UgwfuoMd5…)
Comment
Sorry to drop a Marxism into this conversation but somehow no one mentions the elephant in the room. The problem with the “why no one will press on the breaks?” question is it ignores the Bostrom’s paperclip machine _already_ exists (semi-metaphorically) : it’s called capitalism. A sort of algorithm that seeks to commodify and extract surplus value out of anything that it can. And it will extract it out of human bodies if it can (it already does). We kinda can frame it as a nationalism problem (“if we don’t China will”), but all the unhealthy competitive incentive structure is founded in capitalism, which makes the worse version of AI the most likely outcome, because it’s also the easiest and most immediately lucrative.
Source: youtube | Video: AI Moral Status | Posted: 2025-11-04T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyl8xbbMDubkIbCLlB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw9qnhM8U6V4ym-p6p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgynQOhkwvxuATqD25B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxJB8EAqaa-qhiHt5J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQkWyxzHcwXq6lP6V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzj7cfV4WQql07mbux4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz2_dKEb04mm6Qyulp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgylEWd0mSHiGGFIjaB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyd8jfpG76I2UR_Ep54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzlYpPuP65_axTKv2R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
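Looking a comment up by ID amounts to parsing this batch response and selecting the row whose `id` matches. A minimal sketch of that lookup is below; the dimension names come from the coding-result table above, but the sets of allowed values are inferred only from the rows shown in this batch (the full codebook may permit more), and the helper name `lookup` is illustrative, not part of any real tool.

```python
import json

# Allowed values per dimension, inferred from the rows in this batch.
# The actual codebook may define additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "distributed"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"mixed", "indifference", "outrage", "approval", "fear"},
}

def lookup(raw: str, comment_id: str) -> dict:
    """Parse a raw batch response and return the coding row for one comment ID."""
    rows = json.loads(raw)
    by_id = {row["id"]: row for row in rows}
    row = by_id[comment_id]  # raises KeyError if the ID was not coded
    bad = [d for d in ALLOWED if row.get(d) not in ALLOWED[d]]
    if bad:
        raise ValueError(f"unexpected values for dimensions: {bad}")
    return row

# Example with the one row that matches the inspected comment above.
raw = ('[{"id":"ytc_UgynQOhkwvxuATqD25B4AaABAg","responsibility":"distributed",'
       '"reasoning":"mixed","policy":"none","emotion":"outrage"}]')
print(lookup(raw, "ytc_UgynQOhkwvxuATqD25B4AaABAg")["emotion"])  # outrage
```

A `KeyError` on lookup is a useful signal in itself: it means the model's batch response silently dropped that comment, which is worth checking before trusting the coded table.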