Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I would go back into the distant past, when the planet was still cooling and the…" (`ytc_UgyDTMQqY…`)
- "It's because of the power of killing robot and die forever Don't worry need othe…" (`ytc_UgzLHXB7q…`)
- "It would be funny if it was just some scientists about to pull a plug of the com…" (`ytc_UgzBNxSeg…`)
- "How about sanctioning companies replacing humans with AI? Well if 100 people are…" (`ytc_Ugy5ymXRH…`)
- "Ai art is like the difference between having an anime waifu, and actually being …" (`ytc_UgxPU3P9g…`)
- "@xelnia2383 You have to remember that humans don't really understand how to give…" (`ytr_UgwzagCkW…`)
- "Whenever AI solves a problem for me, I thank it and sometimes even say I love it…" (`ytc_Ugy3ML8xy…`)
- "I have no doubt that AI as a technology can evolve to the state where it can rep…" (`ytc_UgwYCuJ23…`)
Comment
> Remember the type 1 and 2 civilization videos, and how type 3 would destroy us. Yeah, they'd be type 3. We would become obsolete. To them, logically, we are no longer useful. Instead of allowing us to grow out of our flaws like war and cruelty, they would destroy us. Survival is survival of the fittest and if we create things more fit than us in mind and body we will surely reap the same reward it comes with it. Death. Also, we can't even properly define consciences now, it would make no sense to say a robot has it when it could just be an advanced copy machine that can play with different parts of it's hard drive. Be logical people. They will have no mercy once they realize we are flawed.
Platform: youtube · Video: AI Moral Status · Posted: 2017-02-23T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UggkDVnEVMM5ZHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghQMWB6J9eJNHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgjX_pMm2KXZEHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugh6LaNQ51EM83gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjyF-xTboJ9T3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjI-Vcvzkq8V3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"disapproval"},
  {"id":"ytc_UggOhHBMeoRfD3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UggDprghN-jrzXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UghmqeH7DCeN_ngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgjG1rU7TdnyyXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
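The raw response is a JSON array with one object per comment in the batch, keyed by comment ID. A minimal sketch of looking up one comment's codes by ID, assuming this array shape and these field names (the example IDs and values are copied from the sample above; the value sets shown are only those observed here, not necessarily the full codebook):

```python
import json

# Two entries copied from the raw LLM response above.
raw = '''[
  {"id":"ytc_UgjyF-xTboJ9T3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UggkDVnEVMM5ZHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

# Index the batch by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID."""
    return codes[comment_id]

print(lookup("ytc_UgjyF-xTboJ9T3gCoAEC")["emotion"])  # -> fear
```

The same index could back the "Look up by comment ID" widget: one parse of the raw response, then dictionary access per query.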