Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Read Robin Cook's "Abduction", which touches on a topic that applies equally to Robots and Artificial Intelligence:
If you create someone, or something, who is self aware yet subservient to you, and they are designed to enjoy it, is it okay?
And specifically where 100% of all servants are serving of their own limited will from birth and really do enjoy their work, not such situations as where someone has their mind altered to better suit the master or where the servant is a semi-mindless puppet(The House of the Scorpion).
youtube · AI Moral Status · 2017-02-24T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UghBsdvkqrytYXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgiuleNNrJVRZHgCoAEC","responsibility":"unclear","reasoning":"contractualist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Uggd38vfndHWt3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg5R38fstOz_3gCoAEC","responsibility":"creator","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghU6immMZEHlXgCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Uggd8NAdlsfsRHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgjQcetBhk6wU3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggXZRI8LEbBcngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj8JCZ6OH21Y3gCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghZ2hVEk12VdngCoAEC","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
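A raw batch like the one above can be checked before the coded values are written to the dataset. The sketch below is a minimal validator, assuming the allowed labels per dimension are exactly those seen in this page's output (the real coding scheme may define more); the function name and record shape are illustrative, not the tool's actual API.

```python
import json

# Allowed values inferred from the sample output on this page (assumption:
# the real coding scheme may include additional labels).
ALLOWED = {
    "responsibility": {"ai_itself", "creator", "developer", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed", "outrage"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM response and reject records with missing ids
    or out-of-vocabulary dimension values."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing id: %r" % (rec,))
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError("%s: bad %r value %r" % (rec["id"], dim, rec.get(dim)))
    return records

# Hypothetical record, shaped like the batch above
raw = ('[{"id":"ytc_example","responsibility":"creator",'
       '"reasoning":"deontological","policy":"unclear","emotion":"mixed"}]')
print(len(validate_batch(raw)))  # → 1
```

Failing fast here keeps malformed or hallucinated labels out of downstream tables such as the Coding Result shown above.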