History of North America

Were Americans involved in World War I?

Yes. The United States entered World War I on April 6, 1917, after Germany announced the resumption of unrestricted submarine warfare in the Atlantic. The German U-boat campaign threatened American shipping and lives, leading the United States to declare war on Germany and join the war on the side of the Allied powers.