The South really won the Civil War, and the Confederacy never went away.
Face it: we now live in a country controlled by theocrats and oligarchs.