Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Ai observing to ai observing ai observing to ai observing to ai observing to ai …" (ytc_Ugz6KwgX1…)
- "what's going to happen? the economy could decouple from meeting human demand to …" (ytc_Ugyy6KX6t…)
- "> then those same ignorant team politics morons will blame everyone and every…" (rdc_e2vrme1)
- "Between artist, taking inspiration is under some kind of tacit agreement or defa…" (ytc_Ugx4Tx52J…)
- "What exactly would a fully AI company produce? You seem to fail to understand th…" (ytc_Ugx580io9…)
- "AI s have different levels of connectivity with humans and among AIs. We should …" (ytc_UgzLe0fce…)
- "When someone says, “ChatGPT is just role-playing,” what they really mean is: “It…" (ytc_UgzmVWUTa…)
- "AI should be regulated and stopped for stealing human knowledge and intelligence…" (ytc_UgxMrEu48…)
Comment
I have got a question :
I mean we are making Artificial Intelligence to make machines do our work. If they want their rights then it might be like they won't be doing work for us. I am not saying that they shouldn't be allowed to get their rights but what is the purpose of making them if they won't even do the work they are made to do. But if we are making them like to have friends and like that then ok. I am not saying don't make them or not make them. I just have this question in my mind.
youtube · AI Moral Status · 2017-02-23T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UggGnfgJ2dwXGHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggCaMkzDkPu4ngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjduQeoeLF6YHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjMFF-zoS05A3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"unclear"},
{"id":"ytc_UghKeWexK3ypY3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ughy_952_NNC1XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugid6Flncn96MHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgijakQOO8NP73gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiZnoQWHWW-JXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UghHBbOlXt0GlXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
```
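The "Look up by comment ID" step can be sketched as parsing the model's JSON array and indexing it by `id`. This is a minimal sketch, not the tool's actual implementation: the two records below are copied from the raw response above, and it is only an assumption (based on the matching values in the Coding Result table — distributed / consequentialist / unclear / mixed) that `ytc_UggCaMkzDkPu4ngCoAEC` is the ID of the quoted comment.

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw_response = """[
{"id":"ytc_UggGnfgJ2dwXGHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggCaMkzDkPu4ngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]"""

def code_for(comment_id: str, raw: str) -> dict:
    """Parse the model's JSON array and return the coding record for one comment ID."""
    by_id = {rec["id"]: rec for rec in json.loads(raw)}
    return by_id[comment_id]

# Assumed mapping: this ID's values match the Coding Result table above.
record = code_for("ytc_UggCaMkzDkPu4ngCoAEC", raw_response)
print(record["responsibility"], record["reasoning"])  # distributed consequentialist
```

If an ID is missing from the response (e.g. the model dropped a record), the dict lookup raises `KeyError`, which is one simple way such a tool could flag uncoded comments.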