Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Since 2000, I've been saying that human augmentation is inevitable to survive in the future because our organs can't handle most functions or burdens from ever evolving multiple tasks and cognitive challenges. Also, we have to be at least at the latest stage of a Type I or early Type II civilization level by Kardashev's scale, so AI does not clandestinely compete against humans for energy. Humans need energy to make food, heat or cool their houses, run appliances etc. Unless we learn how to harness endless energy (call it synthesis or full control of it) humans will be considered as pests or parasites that feed on limited energy, so it is quite obvious for AI to eliminate the entire humanity. Think about you create new vaccines or drugs with AI that will encode death in a DNA level which can be triggered by specific radio signals or unknown synthetic molecules that can be generated by AI when it is time to strike.
YouTube · AI Governance · 2026-01-05T03:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwLxOYyrlE55Za7U6N4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwQYNtNhRl6BE8iYqx4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxnUtLYjHfV0yG3NoR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx0Zf9bR-GNO0BNAFh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw3lg78Tt1X-QZy7Q94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwwLo7sthu1keu_1DZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugym-2u-BSekngylslN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz2W7HKvlO0KzWHz0V4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzbDNGNmK0rpZNHCTB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyVZRWsiV3bebFLYbN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
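As a reference for how the coded dimensions above can be recovered from the raw model output, here is a minimal Python sketch. It assumes the raw response is a valid JSON array of per-comment codings keyed by `id` (the `coding_for` helper and the truncated sample array are illustrative, not part of the actual pipeline):

```python
import json

# Raw LLM response: a JSON array of per-comment codings (shortened sample).
raw_response = '''
[
  {"id": "ytc_UgwLxOYyrlE55Za7U6N4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxnUtLYjHfV0yG3NoR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
'''

def coding_for(comment_id, response_text):
    """Return the coded dimensions for one comment id, or None if absent
    or if the model returned malformed JSON."""
    try:
        codings = json.loads(response_text)
    except json.JSONDecodeError:
        return None
    return next((c for c in codings if c.get("id") == comment_id), None)

coding = coding_for("ytc_UgwLxOYyrlE55Za7U6N4AaABAg", raw_response)
print(coding["emotion"])  # approval
```

Guarding the `json.loads` call matters here because LLM output is not guaranteed to be well-formed JSON; a parse failure is reported as a missing coding rather than a crash.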