The South increasingly dominates American life, both public and private: Southern politicians lead both parties; the Southern Baptist Convention is the fastest-growing denomination in the country; Southern food and music have swept the North. How did the South come to occupy this position? And what, truly, is "the South"?