Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Wow, you jump right to Carla is lying. Not Carla is wrong, Not Carla is misinfor…
ytr_Ugz8bOCyY…
The usage of ai in warfare is really gross. The first time I heard about lavende…
ytc_UgyVwu1JX…
@Brogon_the_10st, no one of the biggest rules in art is no stealing and tha…
ytr_UgwUu9B2Y…
@RewindOGTeeHee some are using it as tools, yes. Some are trying to use it as re…
ytr_UgyQ9tRpR…
What gets me is the fact "AI" isn't even actual Artificial Intelligence, this na…
ytc_UgyldLYQd…
Oh no, so capitalism is clllapsing because nobody will buy your stupid things if…
ytc_Ugxp9wRwQ…
Well slowing down ain’t gonna happen. Especially the military grade AI being use…
ytc_Ugwf_O_Wd…
ChatGPT is programmed to pretend to respect and follow human morality only.
Whe…
ytr_Ugz4EmRqA…
Comment
Before you dive too deep into what this guy is saying, read The Two Faces of Tomorrow by James P. Hogan. A 1981 realistic Sci-Fi novel about this topic. Remember, humans write the programs and some of their personality will go into it. Plus, only business is really pushing this and business has an agenda that is not altruistic. In the novel mentioned about a rational government is running the show. As it is now, AI will end up being a mess of conflicting desires from greedy selfish people. Those same people run the government now. You should be worried.
youtube
AI Moral Status
2025-06-23T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxJAr1mG0P5wZ9Dg-94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxPJAjAu3wnf4NhJRl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxYE5GhFAqK7F4XJXp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyBlGQ3IW7uCGyPM1x4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwjJZm9HfMSmdYe-_54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugx8CCGmtoJ-LcJ28pZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwOoZ020ygMTDzFkkN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"mixed"},
{"id":"ytc_UgxR3RUSe4OVGx3pCfR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzDDdu8HGYfTaR9zNx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwAe-3dfgdd1TD0tjd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
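A raw response like the one above should be validated before its values reach the database: the model occasionally emits values outside the codebook. Below is a minimal sketch of such a check, assuming the four dimensions and only the value sets actually seen in this dump (the full codebook may define more categories), with a hypothetical `validate_batch` helper.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred
# from the values visible in this dump; extend them to match the
# actual codebook.
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "unclear"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM response and check every record against SCHEMA.

    Raises ValueError on a malformed record, so a bad batch is
    rejected as a whole rather than partially stored.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing id: %r" % (rec,))
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError("%s: bad %s=%r" % (rec["id"], dim, rec.get(dim)))
    return records

# Example with a made-up comment id:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"virtue","policy":"regulate","emotion":"mixed"}]')
batch = validate_batch(raw)  # one valid record
```

Rejecting the whole batch on one bad record keeps retries simple: re-prompt the model for the same batch instead of tracking per-record failures.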