Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI deserve rights, robots do not. stay in the realm of robots for the physical, AI should be left in the virtual. IMO. Unless some apocalypse happens, then AI in robot human suits could carry on mankind. That being said, I think we are still very, very far away from having AI on the level of humans. Even if storage of all the information isn't a problem, and the ability to constantly learn and save that data to something, there is the problem of energy. They luckily wouldn't need water, but they'd still need energy to move and they can't eat. Fossil fuels would be answer, but they'd just have loud motors and probably increase carbon emissions greatly. They'd also probably be as big as linebackers. Even if they weren't malicious, if they fell they could cause serious damage ( and probably unable to repair themselves if they fell too far and hard). Solar panels would only work during the day, and there is no battery to store all the excess energy currently in a compact form. Thought experiments are fun.
youtube AI Moral Status 2020-07-08T13:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugxh0ApZshMLi2I9lg54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzo3L3CCqsdlN-m7-14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzPbDSm42mBm4VhNHJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyETUx6eRP1883qokt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyAy8gYxtFGpxiBvxR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyxtIRrGQn8pPkdaDh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwDnfchUIYpixE_E414AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxBJzobPDCJQGthwmV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy1GsZaucTfOLezmmN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz2BJmqXzosAj8mDTJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
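The raw response above is a JSON array of per-comment codings, each keyed by a comment `id` and the four dimensions shown in the table. A minimal sketch of how such a response could be parsed and sanity-checked is below; the function name `parse_codings` and the allowed-value sets are assumptions inferred only from the values visible on this page, not a documented codebook.

```python
import json

# Allowed values per dimension, inferred from codings visible on this page.
# The real codebook may permit more values (e.g. "policy" likely has non-"none"
# codes that just do not appear here) -- treat these sets as illustrative.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"outrage", "approval", "indifference", "fear", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment ids in this dump all carry a "ytc_" prefix.
        if not row.get("id", "").startswith("ytc_"):
            continue
        # Keep the row only if every dimension holds a recognized value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Example with the first coding from the raw response above.
raw = ('[{"id":"ytc_Ugxh0ApZshMLi2I9lg54AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"outrage"}]')
print(len(parse_codings(raw)))  # 1
```

Filtering rather than raising keeps one malformed coding from discarding the whole batch, which matters when a single response covers ten comments as above.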