Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Fabulous things that made my lif…" — ytc_UgxfCtZLu…
- "I fucking hate Sora, I hate ai in general it destroys everything I have such a g…" — ytc_UgzdfZ05G…
- "The urinal and the banana are bad examples. Probably Bansky, Roy Lichtenstein, o…" — ytc_UgwHz2-pb…
- "It's too late, the genie is out of the lamp and there is no getting it back in. …" — ytc_UgzaBmaY5…
- "You don't have to try convince it that it's conscious. I talked to it many times…" — ytc_UgxuvjHL5…
- "This has to be a bot there’s no way someone could believe consciousness is possi…" — ytr_Ugx-0Dfgp…
- "Ai is simply evolution at work my friend, This video and the rest of YouTube wil…" — ytc_UgxOhSujj…
- "Thank you for your detailed arguments! I wish to all the inspiring and pasionate…" — ytc_UgzRr9wUN…
- "The Cisco MD is the example of what AI brings." — (ID truncated)
Comment

> I only think AI should be u programmed to do specific things by the people who engineered it. If it was supposed to be "human" then it would take forever as a human could have infinite outcomes. If it was for business then feelings would not come in handy as it would make automation harder.

Platform: youtube
Video: AI Moral Status
Posted: 2017-02-28T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgiKYV8v9JQYg3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgjCkPtC30Z9mngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjmL9PTUYn27ngCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UggSRmUXxp_mdngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjhwwXIci4w4HgCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugjo_2qmwrEy2XgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugi2Yut5usR3QHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgjcyN9r0FMRwHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgjSd-41hV6ELXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgiTz-lvV3YGIHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
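A raw response like the one above can be checked before ingestion. The following is a minimal validation sketch, not the project's actual pipeline code: the four dimension names come from the Coding Result table, but the allowed value sets are inferred only from the responses visible on this page and are assumptions that may be incomplete.

```python
import json

# Allowed values per dimension — ASSUMED from the sample responses shown
# on this page; the real codebook may define more categories.
DIMENSIONS = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"approval", "indifference", "fear", "outrage", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse one raw LLM response and return a list of problems (empty = clean)."""
    problems = []
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"unparseable JSON: {exc}"]
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append(f"record {i}: missing id")
            continue
        for dim, allowed in DIMENSIONS.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"{rec['id']}: bad {dim}={value!r}")
    return problems

# Example: one well-formed record (hypothetical id).
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"indifference"}]')
print(validate_batch(raw))  # []
```

Because the model returns a whole batch per call, per-record error messages keyed by comment ID make it easy to re-code only the rejected items.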