History of Asia

Did the US force all countries to disarm after World War I?

The answer is: no

The United States did not force all countries to disarm after World War I. However, the Treaty of Versailles (1919) did require Germany to disarm, sharply limiting its army and navy and banning its air force, and the other Central Powers were required to reduce their militaries under separate treaties. Notably, the United States never ratified the Treaty of Versailles; the disarmament terms were imposed by the Allied Powers collectively.