What the War Was Really About
If you have to ask “what War?” then you probably aren’t from the South. The War is the key event in American history. Nothing comes close. The War for Independence founded the United States, but The War for Southern Independence has come to define American history. It dominates American memory. Unfortunately, as this excellent article at The American Thinker explains, that means most Americans have …