I used to think the American Civil War wasn’t worth fighting, that the North should have just let the South secede, that slavery would have died out naturally.
I changed my mind for several reasons.

First, I realized that the South started the war by firing on Fort Sumter.

Second, I learned (or re-learned, since I had forgotten a lot) more about the state of the country's territorial expansion at the time: Bleeding Kansas and so on. I concluded that the status quo was hardly "peace," and that if the South had seceded, the two resulting countries would almost certainly have fought a series of bloody turf wars over the slavery question in the new Western territories.

Third, I read documents from the South, like each state's declaration of secession as well as popular pamphlets, and realized how deeply ingrained the institution of slavery truly was there. Southerners hardly thought of it as a "necessary evil," as revisionist histories sometimes portray. Rather, they saw it as their God-given right (pointing to Bible verses that allow it) and as the proper place for Black people, whom they considered more animal than human. Based on this, I concluded it was not at all likely that slavery would "die of natural causes" in that culture.

Finally, I thought more about the political mechanics of the conflict: the South seceded and started a war over a democratic election it didn't like. Add the fact that this was all to protect an institution that was among the most evil and abusive in human history and, well, I changed my mind. Fuck the Confederacy. They got what was coming to them.