Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record directly by comment ID.
Random samples

- "The missing piece of logic is that at some point AI will drive the cost of provi…" (ytc_UgxqaJ0el…)
- "It's actually a dentist robot that malfunctioned, yet it isn't self aware or smt…" (ytr_UgwyHKS4r…)
- "I do believe AI is the image of the beast that’s written in the Bible makes per…" (ytc_UgyHy9zFh…)
- "Microsoft have Copilot. Does that mean that if I take my laptop on a 'plane it …" (ytc_Ugx14SgkP…)
- "imagine a calculator that only returns answers when your equation benefits the c…" (ytc_UgwQpTdLH…)
- "Let’s develop AI based on human brain and let it reach the point where it inevit…" (ytc_UgyVMlB9o…)
- "AI will be dead by 2030 😂. It will only be used for medical and technological pu…" (ytc_Ugz13EyNZ…)
- "So this is how the ending begins. With people treating AI like a lesser being an…" (ytc_UgxgT5eqx…)
Comment

> The apocalyptic future that I envision with AI is something like the anime Psycho Pass, where there is super intelligent entity that governs the society and calculates the probability pf people commiting crimes before it happens. It's suveying everything to "make a better world". I think AIs will become like CEOs of companies. Imagine an AI so good at running google or a government that people at the top say this will make much more profit than a person, let us give it all the power.

| Field | Value |
|---|---|
| Platform | youtube |
| Source | AI Moral Status |
| Posted | 2025-11-01T19:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwqDZPwS0sJhzustSl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwfVIgjc9RUVbtK2Yx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwBkZ0RB2dzvKO0Wc54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyRx8kIRspv6bsRE4J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxDgzcIUZXZuAzgHSR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzYdePcFg5OXhfaaV14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz8uz5IUjT5JCw33wF4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy3sz23nrUfIxdlCFJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwnojViKzl0G8CMj794AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyH4hSorqWq8zxU7AN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
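The lookup-by-comment-ID view can be backed by parsing a raw batch response like the one above and indexing its records by `id`. The sketch below is a minimal illustration, not the tool's actual code: it assumes the raw response is a JSON array of objects with the five keys shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`), and the function name and sample IDs are illustrative.

```python
import json

# A raw batch response in the same shape as the one shown above:
# a JSON array with one record per coded comment.
raw_response = """
[
  {"id": "ytc_UgwqDZPwS0sJhzustSl4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwBkZ0RB2dzvKO0Wc54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

# The five dimensions every record is expected to carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    silently skipping entries that are not well-formed records."""
    records = json.loads(raw)
    return {
        r["id"]: r
        for r in records
        if isinstance(r, dict) and EXPECTED_KEYS <= r.keys()
    }

by_id = index_by_id(raw_response)
print(by_id["ytc_UgwBkZ0RB2dzvKO0Wc54AaABAg"]["emotion"])  # fear
```

Indexing once into a dict makes each subsequent ID lookup O(1), which fits an interactive inspector better than re-scanning the array per query.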