Oh gawd, no. It would arrest human cultural evolution and freeze it in place in ways that would quickly become maladaptive.
Human societies are organic wholes—meta-organisms, if you will. They are swarms of memes, zeitgeists, and historical and economic trends that ripple, cascade, and tip thresholds we haven’t even begun to define, as they compete for survival in the marketplace of ideas. The processes of collective deliberation, innovation, and social control simply can’t be turned over to a machine without sacrificing the collective intelligence that human survival depends on. (See James Surowiecki’s The Wisdom of Crowds.)
First, who could you possibly trust to program such a thing? The temptation to build in advantages for the programmer class at the expense of everyone else would be positively overwhelming. The temptation to favor one ideology over another would likewise be irresistible—and would inevitably force a consensus on society that would be perceived as oppressive, if for no other reason than that it was not organically evolved. Imagine some religious nut trying to program his ideas into the system. Now imagine being a religious nut and believing that the system stood between you and your salvation.
Second, what kind of fail-safes would it have? Any system of government has to have coercive powers. We have problems with concentrations of power even in a democratic society with built-in checks and balances. Imagine concentrating all that power into one central authority, arming it with tireless surveillance over the most minute aspects of human life, and then turning it over to an algorithm.
How could you possibly build in error-correcting capabilities robust enough not to be overwhelmed by the chaotic dynamics of population-scale systems? No, trust me: it would be far more benign, and more efficient, to let fit forms of government evolve naturally.