History of North America

Did the US sign the document that ended World War I?

The United States signed the Treaty of Versailles, the document that formally ended World War I, but never ratified it. President Woodrow Wilson signed the treaty in 1919, yet the U.S. Senate refused to ratify it, chiefly because of objections to joining the League of Nations the treaty would create. As a result, the United States did not formally end its state of war with Germany until 1921, when Congress passed the Knox-Porter Resolution and the two countries signed a separate peace treaty, the Treaty of Berlin.