Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.
Random samples:

- It will indeed be totally rational, yes, but what if it's totally more rational … (ytr_UgypwtV20…)
- so nobody has a job. means no money, so all these companies automating all the t… (ytc_Ugx6eULUp…)
- Something else to consider: I am against any self-driving vehicle but especial… (ytc_Ugy5qQ40D…)
- Unfortunately, AI is now a deterrent more than anything else. Even if private e… (rdc_n0gzk9l)
- So why is it always..." you have no idea whats comimg" tell us so WE who are no… (ytc_UgxcENAXC…)
- As much as I like everyone else want to assign blame/ responsibility, we must as… (ytc_UgxZWHldV…)
- …hmm…sure but A.I. isn’t what you all wished for …everything at the right TIME!!… (ytc_Ugwembfoh…)
- There is nothing wrong with nightshading your own art and that shouldn't be comp… (ytr_UgyteIZXL…)
Comment

> Why do people create robot and AI? because we need labour to do intensive and dangerous things that would cost billions if done by human. Human need salary and other benefits. If we give rights to AI, then their existence doesn't resolve anything. They would demand equals compensation as human. The cycle would repeat itself, until human get wiped out of civilization. Pathetic isn't it? losing to our own creation.

Source: youtube · AI Moral Status · 2020-07-08T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
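
The coding result above assigns one value per dimension. As a minimal sketch of how a record like this could be checked before display, here is a validator whose allowed values are inferred only from the responses shown on this page (the full codebook may define more; the function and variable names are illustrative, not part of the tool):

```python
# Allowed values per dimension, inferred from the raw responses on this page.
# Assumption: the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "unclear", "ai_itself", "distributed", "user"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "fear", "resignation", "outrage"},
}

def validate(record: dict) -> list:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coded record shown in the table above.
rec = {"id": "ytc_Ugx10SywtLayObwrqHJ4AaABAg", "responsibility": "distributed",
       "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
print(validate(rec))  # []
```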
Raw LLM Response
```json
[
  {"id":"ytc_UgxETV1MDiKHFMDczol4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwyql58qzAdMjrxuTd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwT2ROP1DdiiDKGB6l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx10SywtLayObwrqHJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxskrXQC6_fIBlsuDR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxpi_ocEZKOU8fp3oF4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxeEqy56N4G3NSRKx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy7aG3aZkGEfL0ueo14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwhabjBBTmtZ8xk_bR4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxnqib2a0o5Oe0zda94AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
```
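
The raw response is a JSON array of per-comment records, and the page supports looking up a response by comment ID. A minimal sketch of that lookup, assuming only the array shape shown above (the two records here are copied from the response above; variable names are illustrative):

```python
import json

# Raw LLM response: a JSON array of per-comment coding records,
# abbreviated here to two records copied from the batch above.
raw = """
[
  {"id": "ytc_Ugx10SywtLayObwrqHJ4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwT2ROP1DdiiDKGB6l4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

records = json.loads(raw)
# Index the batch by comment ID for O(1) lookup.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_Ugx10SywtLayObwrqHJ4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # distributed resignation
```

Indexing once and reusing the dictionary avoids rescanning the whole batch on every lookup, which matters when many coded batches are loaded at once.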