70 years later: How World War II changed America (2015)
Even as World War II was ending 70 years ago, Americans already knew it had transformed their country. What they didn’t know was just how much or for how long.
In that last wartime summer of 1945, the seeds of a new America had been sown: not just postwar America, with its Baby Boom, Cold War, Affluent Society, and sprawling suburbs, but the one in which we live today.