Historical story

What happened to women workers when World War II was over?

During World War II in the United States, the massive mobilization and conscription of men for the war effort expanded job prospects for women, who stepped into roles traditionally held by men, such as building ships and working in factories and other industries. When the war ended in 1945, the majority of these female workers gave up their jobs in the public sphere and returned to the domestic sphere of home and family, although some continued to work in male-dominated industries. While some industries, like advertising, embraced the idea of women in roles traditionally held by men, others saw the return of male workers as a natural and desirable consequence of the war's end.