Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "That’s also to mention it’s humorous. Charlie can laugh about it, which is a qua…" (ytr_UgzNb5svK…)
- "If you want to ask ChatGPT et al if this is true, here are some suggested prompt…" (ytr_UgzQdsRNA…)
- "It's less that it takes inspiration, many cases ai "art" just traces artwork and…" (ytc_UgyvkpWCK…)
- "Very interesting. And scary. But as professor Zuboff is saying, surveillance cap…" (ytc_Ugz2wIfEj…)
- "Why do you have this unrealistic expectation that every parent must have total c…" (ytr_Ugzh_3YPl…)
- "the thing that bothers me about the "ai will destroy humanity" argument is this:…" (ytc_UgxCxEDMk…)
- "I have a tolerance for ai, when it is used correctly and transparently / Want to …" (ytc_UgwzdRDsj…)
- "That is a fear i have / I worry people will ascribe human feelings to AI and start…" (ytc_UgwUN8ezW…)
Comment

> well if you're robot do everything that the people command them to do well he like she will just do anything she says he said she said and he said if you want to the store humans and she will automatically say yes because she and he asked her that so that's kind of like this is not real and it took like a path a year from when does this uploaded anybody cares

youtube · AI Moral Status · 2017-08-05T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxyVG4yFA_H9FFXxTR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwqvcZo0h-IRPrWsyJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxx7OqpsyohyDuc2hB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwoOnjX9XHS6zdKt254AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwRLR9c4CHzuizKVy54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwgjqoNSvhJdsY166d4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzvVKV9Vh6PF2X9ypF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgiSPSbAKBMaSngCoAEC","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghtO0zPzER0O3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UghNXdUApwhS_XgCoAEC","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
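A batch like the one above can be looked up by comment ID with a few lines of standard-library Python. This is a minimal sketch, not the tool's own lookup code: the `raw` string is an excerpt of the batch, and the printed fields are just an example query.

```python
import json

# Excerpt of a raw LLM coding batch (same shape as the full response above).
raw = """[
  {"id":"ytc_UgxyVG4yFA_H9FFXxTR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UghtO0zPzER0O3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# Index the batch by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

result = codes["ytc_UghtO0zPzER0O3gCoAEC"]
print(result["policy"], result["emotion"])  # regulate fear
```

Building the dict once and querying it repeatedly is cheaper than rescanning the JSON array per lookup when inspecting many coded comments.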