Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If anything, giving robots free will would be the most catastrophic thing you can do. They'd become just like humans, but far more intellect and (probably in some cases) incapable feeling pain, so tackling one out of way if you were to fight one for survival, you'd lose without proper weapons... Although all that depends really on robot too. Free will is what makes us do stupid things out of order, instead of set and calculated process that we would follow... So don't give robots free will or research on that. That's at least some of my thoughts. We don't really need any more chaos than what we already do... (Nukes was one big mistake to learn and research about for example... So now it's just matter of time when one blows up...)
Source: youtube · AI Governance · 2024-04-03T15:5…
Coding Result
Responsibility: developer
Reasoning: consequentialist
Policy: regulate
Emotion: fear
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugy0O-CQLXFpU5fOhaN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzE38gYZJHSrcL1VVh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy9v4p1qYDUI1choll4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwpREhCn8rovUo2Oop4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy1NmYI2gi6bujiTnJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwBcw7J-dzxQxV-SqZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxNRYQKZC35KQf5r2t4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgyJx9Gqe-bHpXiQWoZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy1pqx7b8-3QdFDK8J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwcgVrr2LAttvXorXZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
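The per-comment dimensions shown above are extracted from a batch JSON array like this one, keyed by comment id. A minimal sketch of how such a response could be parsed and validated, assuming the record shape shown (the id values are copied from the raw response; the required-key check is illustrative, not part of the pipeline):

```python
import json

# A subset of the raw batch response above (truncated to three records for brevity).
raw = '''[
  {"id":"ytc_Ugy0O-CQLXFpU5fOhaN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwBcw7J-dzxQxV-SqZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy9v4p1qYDUI1choll4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]'''

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

records = json.loads(raw)
by_id = {}
for rec in records:
    # Guard against the model dropping a dimension from a record.
    missing = REQUIRED_KEYS - rec.keys()
    if missing:
        raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
    by_id[rec["id"]] = rec

# Look up a single comment's coding by its id.
coded = by_id["ytc_UgwBcw7J-dzxQxV-SqZ4AaABAg"]
print(coded["responsibility"], coded["policy"], coded["emotion"])
```

Indexing by `id` makes it cheap to join a record back to the comment it codes, as in the "Coding Result" panel above.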