Historical story

What happened to most working women after World War I?

Most working women returned to traditional domestic roles after World War I. Although millions of women in the United States and Europe entered the workforce during the conflict and gained a measure of independence, financial power, and standing alongside their male counterparts, most were laid off after the war to make room for returning soldiers. Traditional attitudes and gender norms, which cast women as homemakers rather than breadwinners, prevailed.