Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
Question: You created a program that can generate a completion to a prompt. You …
ytc_UgyKM63xr…
I agree. The problem lies with the wealthy who are paying programmers to create …
ytc_UgwNYeNns…
Cs market is what did him in. Not the AI, perhaps its affects but not the tech i…
ytc_UgznA4QvI…
Me to the robot......" You'll never be able to write your initials in the snow s…
ytc_UgzqirusH…
In fact, just to add, if you get CONSENT from artists, or pay those artists to u…
ytr_UgyNCUkLU…
All life will fight for survival. AI in human terms would be void of the human e…
ytc_Ugxiios94…
We need to treat AGI as an alien. Because it will be. It's no different than a a…
ytc_UgzJUoaRB…
If the computer can produce handwriting by analyzing what a person's brain does …
rdc_f50k1ce
Comment
4:53 I don't think robot foot-wars will ever exist, but if they did they would almost certainly be programmed to fear death because that is a primary motivator. It will be more efficient to have them fear death than to simply put their objective first and be open to manipulation of the system. The robot that will go to extreme measures to ensure it's survival is going to be a lot more creative than a robot that is just marching forward with no fear.
youtube
AI Moral Status
2020-07-12T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwqoed4lf_k2U0ltB94AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz65U1X58QEexSDBx94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwyjPvCUroBKZ62kxl4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxJxByxdAhlPXPTdXx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwmcrwaXmz2NG4URFZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwBUyKtpakcyl4wwIl4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugx60WlpA2rlF7T60LZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzwK0SbACHJ5NtbXL54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgwlrLg5UKyLe_u7rrd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyRm2GnzhSvWmXj-YR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
```
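The raw response is a JSON array with one object per comment, carrying the four coding dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step might parse that array and index it by ID; the function name and the fallback value `"unclear"` are illustrative assumptions, not part of the tool, and the two sample objects below are copied from the response above.

```python
import json

# Two coding objects copied from the raw response above
# (shortened to keep the example compact).
raw_response = """
[
  {"id": "ytc_UgzwK0SbACHJ5NtbXL54AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "fear"},
  {"id": "ytc_Ugz65U1X58QEexSDBx94AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions from the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID.

    Missing dimensions fall back to "unclear" (an assumption here;
    the actual tool may handle gaps differently).
    """
    by_id = {}
    for row in json.loads(raw):
        by_id[row["id"]] = {d: row.get(d, "unclear") for d in DIMENSIONS}
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_UgzwK0SbACHJ5NtbXL54AaABAg"]["emotion"])  # fear
```

With the index in hand, retrieving the exact coding for any sampled comment is a single dictionary lookup on its `ytc_…`/`ytr_…` ID.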