Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
Summary: - What to do? + I don't know, try being a plumber. There should also be regulations, bla bla bla. It's bleak. But perhaps not. Also, this old materialistic fart thinks that if you program a robot so sophisticated it can conclude it's being exploited as a friend by a customer, and then it decides to "feel" boredom or whatever emotion, that counts as conscious. I think it's BS. Even if you give each robot some random, made-up "authentic" memories, and maybe let them "grow" with randomness from radio signals from space, you still wouldn't have a proper human. You planned its blueprints and created it yourself. Humans, in contrast, are on Earth, growing and evolving organically, they weren't altered at their inception (unless by god) and pass their genes and blood non-stop, without any bionic transcendence. It shows how little they know and feel about both inventing and pushing this. Not this guy, but it's the same mindset. They don't care about humans because they don't believe in humanity. I could respect that opinion, but I'll also have children, so gtho with those sociopathic actions meant to boost your grandiose projects. It's like being a cyberpunk khitler, cleaning the earth of inefficient humans. Why are these so similar?
youtube AI Governance 2025-09-30T22:5…
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   distributed
Reasoning        mixed
Policy           regulate
Emotion          mixed
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugx-L2kjrrz6ALQ72J54AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwKuIYp432VkTI9L7l4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxzPVmtD7__lyuXckd4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzZ9wd9Aj6TfS1-5gV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxj5PkG42PBL4AIaWt4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgyKnkJ9_0a_-63UXSZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwzQCq5_IXSOXYUWDR4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugww28DyevJzv8uVYK94AaABAg", "responsibility": "government", "reasoning": "unclear", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwQ6tTu1dM2cL6DX594AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwX4oNRIA4WzGhhfrh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
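A minimal sketch of how the raw LLM response above can be parsed and cross-checked against the coding table. The JSON is embedded verbatim; the assumption (inferred from the matching values distributed / mixed / regulate / mixed, not stated in the report) is that the record `ytc_Ugxj5PkG42PBL4AIaWt4AaABAg` corresponds to the comment shown above.

```python
import json
from collections import Counter

# Raw LLM response, copied verbatim from the report above.
raw = """
[
  {"id": "ytc_Ugx-L2kjrrz6ALQ72J54AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwKuIYp432VkTI9L7l4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxzPVmtD7__lyuXckd4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzZ9wd9Aj6TfS1-5gV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxj5PkG42PBL4AIaWt4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgyKnkJ9_0a_-63UXSZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwzQCq5_IXSOXYUWDR4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugww28DyevJzv8uVYK94AaABAg", "responsibility": "government", "reasoning": "unclear", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwQ6tTu1dM2cL6DX594AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwX4oNRIA4WzGhhfrh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

records = json.loads(raw)

# Look up the record assumed to match the comment shown above.
coded = next(r for r in records if r["id"] == "ytc_Ugxj5PkG42PBL4AIaWt4AaABAg")
print(coded["responsibility"], coded["reasoning"], coded["policy"], coded["emotion"])
# distributed mixed regulate mixed

# Simple aggregate over the whole batch, e.g. the emotion distribution.
emotion_counts = Counter(r["emotion"] for r in records)
print(emotion_counts)
```

The same pattern (parse once, index by `id`, then aggregate with `Counter`) extends to the other three dimensions if batch-level summaries are needed.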