Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Manufacturers have been replacing humans for over 100 years. Institutions have followed their example. The AI revolution is simply a new step in this course of events. However: best leave the word "simply" out, as things may get more complicated. In any case: tremendous effort/research/investment to replace humans will result in .... replacement. Don't claim it comes as a surprise! There are good sides of robotization, too, incredible good sides. Big Tech, however, will not be able to restrain itself, so dollars will prevail to a dazzling extent over human values. Furthermore, politicians and policy makers, law makers, have no clue about any of the implications. So the key question is: Can humanity adapt quickly enough to innovation at rocket speed?
youtube · AI Moral Status · 2024-01-14T13:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
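Each coded record carries the four dimensions shown in the table above. A minimal validation sketch, assuming the allowed values are only those observed on this page (the actual codebook may define additional categories):

```python
# Sketch: check one coded record against the four coding dimensions.
# The allowed-value sets below are inferred from the records visible on
# this page; the real codebook may include more categories.
OBSERVED_VALUES = {
    "responsibility": {"company", "developer", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"approval", "fear", "resignation", "outrage", "mixed", "unclear"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    for dim, allowed in OBSERVED_VALUES.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The record rendered in the table above passes validation.
record = {"responsibility": "company", "reasoning": "consequentialist",
          "policy": "none", "emotion": "resignation"}
print(validate_record(record))  # []
```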
Raw LLM Response
```json
[{"id":"ytc_UgzQVPp21D7fijQtH6V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugyz3pdYsNQm8ykzPl94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugy5_I0RU5Vz2Bhdp3V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxPNgUrY5F-UC89K8t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgyLNq6gix5QR4ASGKV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugy-x1muP81WbO_gCG94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyipU49xgmUPRxi_Kl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
 {"id":"ytc_UgysFb8XZ2S1tig3HZF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwZRiRbZP11KkvOCqt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwkuTvtWGPgc2Skp2p4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"}]
```
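A batch response like the one above can be parsed and indexed by comment ID for lookup. A minimal sketch, using a two-record excerpt of the reply in place of the full ten-record array:

```python
import json

# Sketch: parse a raw batch coding response into a lookup table keyed by
# comment ID. The string below is a two-record excerpt of the response shown
# above, standing in for the full array.
raw_response = '''[
  {"id": "ytc_UgzQVPp21D7fijQtH6V4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy5_I0RU5Vz2Bhdp3V4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]'''

records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Retrieve the coding for a single comment by its ID.
coding = by_id["ytc_Ugy5_I0RU5Vz2Bhdp3V4AaABAg"]
print(coding["emotion"])  # resignation
```

Indexing by ID is what makes the "inspect any coded comment" lookup cheap: one pass over the response, then constant-time retrieval per comment.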