Almost everything about American society is affected by World War II: our feelings about race; our feelings about gender and the empowerment of women, moving women into the workplace; our feelings about our role in the world. All of that comes in a very direct way out of World War II.