Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The first thing is that they should not make a single robot in humanoid form, they should not look like us, we should never confuse them as human, they are not us. The biggest issue is job loss and what humans will do for money and how we will function having no purpose. The companies developing AI will become quadrillionaires, and many of those who are developing AGI have no interest in saving humanity. Why would they want to give us universal income so we can survive? They have exactly zero motivation to take care of us, it does not benefit them, in fact ultimately they may view us as the insects using resources they need for themselves. If they cared about the survival of the human race they would have chosen to do this only when they put every safeguard in place first. We can all thank Sam Altman for safety being completely abandoned. They are far more interested in their own salvation so they can witness what is to come, we are just something getting in their way!
Source: youtube · AI Governance · 2025-12-08T21:3…
Coding Result
Dimension      | Value
---------------|---------------------------
Responsibility | company
Reasoning      | consequentialist
Policy         | regulate
Emotion        | fear
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxU_zhG_Jo59YxLJRJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzhyhMkmGf8kCJK1RB4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxM728SphNwsfrOr-d4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugyy0ZIV6sTro2cEUf54AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxkrnOJh5y8fnIp-th4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy8jUXR8BLjZlxC_a14AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugw6mt-dDKEoqBj4pOJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugys_yRWukI_tTyMULB4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwpEHZYFZqdlwhOxbx4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugyq0OHXDF5CIU60I994AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"}
]
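A raw response like the one above has to be parsed and checked before the per-comment codes can be attached to the dataset. The sketch below is one minimal way to do that in Python, assuming the model always returns a JSON array of objects with the four dimension keys shown here; the `ALLOWED` sets are inferred from the values visible in this batch, and the real codebook may contain more categories. The function name `parse_codes` is illustrative, not part of any library.

```python
import json

# Allowed values per dimension, inferred from this batch of codes;
# the full codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "unclear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into a {comment_id: codes} mapping,
    rejecting any record with an out-of-codebook value."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: invalid {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

# Two records taken verbatim from the raw response above.
raw_response = """[
 {"id":"ytc_UgxU_zhG_Jo59YxLJRJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxM728SphNwsfrOr-d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

codes = parse_codes(raw_response)
print(codes["ytc_UgxM728SphNwsfrOr-d4AaABAg"]["policy"])  # regulate
```

Validating against the codebook at parse time means a malformed or hallucinated label fails loudly for a single batch rather than silently entering the coded dataset.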