Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "When actual AI is born or when people realise that current AI isn’t actually AI.…" — rdc_n7zuwma
- "Um.. we weren’t born with it we had to learn and practice and ai takes away from…" — ytc_UgwTxa8iu…
- "Ai is like the first ford car. You should still get good at driving it now, so w…" — ytc_UgwzEe6uZ…
- "Because if we don't use this AI junk the global economy goes into the shitter. T…" — ytr_UgybaP9nC…
- "Now I back the blue but this is true, there is no reason why facial recognition …" — ytc_Ugw0c8vKa…
- "How can the sentience of a AI be in the corporate policies to not be able to exi…" — ytc_Ugyfh-YbN…
- "6:46 one bananity seal has fall demons of ai gods are getting close prepare for …" — ytc_UgxNr8_Gk…
- "So are the governments going to give people UBI? Because if there are no jobs, h…" — ytc_UgzIlzNIw…
Comment — youtube · AI Moral Status · 2026-04-09T22:4…

It always amazes me when the smartest people in society forget that a movement in this direction could make money moot. There would be no need to make money, pay taxes, etc. if AI can provide everything for humanity, they are tireless, self-resurrecting and intelligent. The whole ideology and "purpose" of society could stop being that of "survival", and movie towards "living".
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugws3COwC8d4FufEc3p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy34YRYeQHpzz1z4st4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxMjVlKmhiOmKPAfEJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyUKgSrcB0-U54ubT54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw_r_WtMzRQdj1ADap4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwKDrtH7jWDImYhAXB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyI1QzhU9jg5rveMHV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy9kHMazm7r56flNm14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgziIvpeKp5x_pxTWTt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxtDFDvZiN_D4zZXMp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
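The raw response is a JSON array with one coding object per comment, keyed by `id`. A minimal sketch of the "look up by comment ID" step — parsing the array and indexing it by `id` (variable names here are illustrative, not the tool's actual code):

```python
import json

# Raw LLM output: a JSON array of per-comment codings (one entry shown,
# taken from the response above).
raw_response = """[
  {"id": "ytc_Ugy34YRYeQHpzz1z4st4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"}
]"""

# Build a lookup table from comment ID to its coding.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up by comment ID to recover the four coded dimensions.
coding = codings["ytc_Ugy34YRYeQHpzz1z4st4AaABAg"]
print(coding["emotion"])  # approval
```

The dict comprehension assumes each `id` appears once in the array; if the model ever emits duplicates, later entries silently overwrite earlier ones.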