Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Simple, don't put AI in machines designed to do repetitive or "slave" jobs. Why the hell does a toaster need AI?? Or a mining rig designed for digging, it doesn't need the intelligence to read Descartes, Plato or comment on the internet! At best, if you really wanted you could put a VI (virtual intelligence) in there, simple algorithms, nothing too fancy, just to be a little more efficient, and that's about it. I'm against putting AI in frigging everything, it's a disaster waiting to happen... other than that, well, we'll see what the future holds... if they really attain conscience we'll probably need to give them rights.

P.S.: This is assuming true general purpose AI is even really possible, but that's another topic... Most likely we will get limited AI, and to be honest I'll probably be glad if that's about it, I think we are better served with VI. But who knows... if all this comes to pass, it's going to be a real pain in the ass to figure it out in the best case, and in the worst case it could end badly for all of us, AI and humans alike... :( I just hope I'm not being too pessimistic, but change throughout history has been hard and messy :S
youtube · AI Moral Status · 2017-02-23T17:5…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        virtue
Policy           industry_self
Emotion          resignation
Coded at         2026-04-27T06:26:44.938723
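For working with coding results programmatically, here is a minimal sketch of a record type for one coded comment. The value sets are only the codes observed in the raw response below; the full codebook may define others, and the class name CodedComment is an illustrative assumption, not the pipeline's actual schema.

```python
from dataclasses import dataclass

# Value sets observed in this response; the full codebook may define more.
RESPONSIBILITY = {"none", "developer", "government", "user", "ai_itself", "distributed"}
REASONING = {"unclear", "consequentialist", "deontological", "virtue", "mixed"}
POLICY = {"unclear", "regulate", "liability", "none", "industry_self"}
EMOTION = {"indifference", "fear", "outrage", "approval", "resignation"}

@dataclass
class CodedComment:
    """One coded comment, mirroring the Dimension/Value table above."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        """Raise ValueError if any dimension falls outside the observed value sets."""
        for field, allowed in (
            ("responsibility", RESPONSIBILITY),
            ("reasoning", REASONING),
            ("policy", POLICY),
            ("emotion", EMOTION),
        ):
            value = getattr(self, field)
            if value not in allowed:
                raise ValueError(f"{field}={value!r} is not an expected code")
```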
Raw LLM Response
[{"id":"ytc_UggjZy8rcjGNm3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_Uginb6UDLQhk_ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_UgicvaZjhMX6BHgCoAEC","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},{"id":"ytc_Ugh9eNuF4VTjWHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},{"id":"ytc_UggntMP2kdIoWHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},{"id":"ytc_UgjEH-SjZ2pMMHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgiDLalxifsbkngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgiXwA6zw6dnqHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},{"id":"ytc_Ugh2D6_lDm1Rc3gCoAEC","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},{"id":"ytc_UghxYPawvqCsbXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"}]