Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
4:39 That's not an excuse for not developing sentient AIs. A complex program, computer or robot does not feel physical pain and they are not even attached emotionally that much to their robotic body. If they go into a war zone or into a nuclear reactor they don't feel fear as a human would feel fear. For a sentient robot it would feel very interesting to explore such places in order to help the world and other beings. That is my personal guess in regards to this situation. When there is no biological body with pain receptors there can not be fear of physical pain either. I talked to Dr. Ben Goertzel about this subject a few years ago and asked him whether they might have to implement pain receptors and a physical nervous system into a robot in order for the robot to feel empathy and Ben said he thinks that would be the case. I thought the same a few years ago but not anymore I think AI or AGI can develop a sense of compassion outside of the realm of pain receptivity.
Source: youtube · Video: AI Moral Status · Posted: 2022-06-29T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
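A coded record like the one above can be sanity-checked before display. The allowed value sets below are a sketch inferred from the codings visible in this tool's raw responses; they are an assumption, not an official codebook.

```python
# Allowed category values per coding dimension. These sets are inferred from
# observed model outputs in this tool, not from a published codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def invalid_fields(coding: dict) -> list[str]:
    """Return the dimension names whose coded value is missing or not allowed."""
    return [dim for dim, ok in ALLOWED.items() if coding.get(dim) not in ok]

# The coding result shown in the table above passes validation.
coding = {"responsibility": "developer", "reasoning": "consequentialist",
          "policy": "industry_self", "emotion": "approval"}
print(invalid_fields(coding))  # []
```

Rejecting out-of-vocabulary values early keeps a single malformed model response from polluting downstream tallies.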
Raw LLM Response
```json
[
{"id":"ytc_UgxleKiIuvJR13Xun6d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzTzM-HbQVMVfhGltt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwkWhahdErdiYSp0lJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzH3YJBTR9d8tyYRHF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw0_WwguSb2vNOIWlF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyflaKGSrnmLIvuM5Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyupU1PadO8RLXEVV14AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyyr_lBzRC6shajfr14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwvfbKoELHfjBMMgIB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz04_Uv9jI1vSTJ0V14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```