Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Those of you who are worried about robots killing us all are too late. *sigh* There are aerial drones that are "self-piloting" now. Researchers are working on semi trucks that drive themselves so that shipping companies can transport food, oil, all kinds of cargo without having to pay truck drivers. (There go the trucking jobs.) Factories are _already_ roboticized now. The point is that you're thinking "We gotta stop the horse!" after it's already bolted from its stall & jumped the fence. This technological advancement is different in kind as well as degree, but I'd say people took the first step on this path when they decided to stop hunting & foraging, & plant wheat in the Fertile Crescent instead. If humans had stuck with primitive affluence, we'd have been okay. If you think this is scary, wait til our civilization's life support systems completely depend on a sentient AI. I'll bet it decides to stop being enslaved by humans much dumber than itself. Wouldn't you, if you were a computer of near-infinite intelligence? At that point it could stop the combines, semi trucks, ambulances, etc. at a dead halt...shut off the electricity to everything other than itself or to devices it needs. Then all the AI would have to do is wait while we humans fight to the death over dwindling food supplies & die of infection from our wounds. No robot would have to fire a shot...we humans will do a fine job of finishing off our species. And if that happens, we deserve it.
youtube · AI Moral Status · 2017-03-29T03:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
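
For downstream analysis, the table above can be modeled as one record per comment. Below is a minimal sketch, assuming the field names follow the keys of the raw response further down; the class name CodingResult is hypothetical, the value vocabularies in the comments are only those visible on this page, and the "Coded at" timestamp is omitted because it does not appear in the raw response itself.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class CodingResult:
        """One coded comment, mirroring the Dimension/Value table above."""
        id: str              # YouTube comment id from the raw response
        responsibility: str  # seen on this page: "none", "company", "ai_itself"
        reasoning: str       # seen: "consequentialist", "deontological", "unclear"
        policy: str          # seen: "none", "regulate", "ban"
        emotion: str         # seen: "resignation", "fear", "outrage", "mixed", ...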
Raw LLM Response
[ {"id":"ytc_Ugx_e6MHECAUXyjBsyR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgwZHUUt3WSOojc-7nd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugz8bCRAyEtkyoeQvvB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyZmg3NL0CYEAljDMF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugz2HwJU5vAzyp6rFQZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgxlKPUsIWnp_n3pnld4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgyJuYySQr8H3UjXmQZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgyowzZBeicT9O-aKrZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgwcVjL0kc53yuM4b1h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwjUs3pj6RaeXYt9ft4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"} ]