History of the United States

Does the Progressive Era refer to the years following World War II?

No. The Progressive Era was a period of social activism and political reform in the United States that took place from the 1890s to the 1920s. It was a time of great change in the country, as the United States was rapidly industrializing and urbanizing. Progressive reformers sought to address the problems that came with these changes, such as poverty, corruption, and social inequality.

The Progressive Era ended in the 1920s, with the rise of conservatism and the election of President Warren G. Harding in 1920. World War II did not begin until 1939, roughly two decades later, so the Progressive Era preceded the war rather than followed it.