Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The A.I. everyone is referring to is better termed a more advanced expert system. If one feels that a cybernetic device is aware of them under conditions of everyday life or in warfare, it is only the result of humans anthropomorphizing these objects. No matter how sophisticated, they are only expert systems driven by rules. And remember, these are limited to binary coding. No matter how complex the algorithm, no matter how complex the program, they are expert systems with very, very, very limited autonomy. The prohibition against attempting to integrate an array of dangerous instruction code involved in recognition of targets would, in experimental stages, result in "self-aware" (I'm using the term very sparingly) drones or robots attacking the people building them. Their threat is overwhelming. And if another country tries to use drones with the always-poor ability to discern friend from enemy, our drones would be more numerous and own them on any battlefield.
youtube 2018-04-03T15:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwOHWOKk4mrzdvx6r14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwD5zjsOKm381BRwkp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwKd151bVJvhA7QixJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy6QvnlNOFGFQu3fph4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzhO-M52B0Uwg-KnsF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyMkcekidWccs1Q-Pt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyoglIh54yKKwxPFFh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxWqUR3rjToEfHbbWB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugytlda4Gn_CBnVOaCt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwP0Zt_2THvcx3nmvV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
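The raw response above is a JSON array of per-comment records, each carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a response could be indexed and looked up by comment id; the `index_codings` helper is hypothetical, and the two sample records are copied from the response above:

```python
import json

# Sample of the raw LLM response: a JSON array of coding records,
# one per comment, keyed by the comment's id.
raw_response = '''[
  {"id": "ytc_UgwD5zjsOKm381BRwkp4AaABAg",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzhO-M52B0Uwg-KnsF4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "outrage"}
]'''

def index_codings(raw: str) -> dict:
    """Parse the raw response and map comment id -> its coding record.

    Hypothetical helper: assumes the response is a well-formed JSON
    array in which every record has an "id" field, as shown above.
    """
    return {rec["id"]: rec for rec in json.loads(raw)}

codings = index_codings(raw_response)
record = codings["ytc_UgwD5zjsOKm381BRwkp4AaABAg"]
print(record["reasoning"], record["emotion"])  # deontological indifference
```

A lookup like this is how the single-comment "Coding Result" table shown earlier can be recovered from the batch response.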