How the 1960s cured America
Forty years after they ended, the 1960s remain the most controversial decade of the 20th century. Either you believe that they destroyed America, or that they cured it.
Put me down as a fervent believer in their success as a cure.
Before 1960, only undivorced white Protestant men had ever served in the White House. Almost 100 years after the Emancipation Proclamation, African-Americans lived in segregated communities and attended segregated schools on both sides of the Mason-Dixon Line, and none attended the all-white state universities of the South.
Gay people were completely invisible, except when they were fired from the federal government (or from any company doing business with the federal government, where they were also banned from employment)...