History of North America

Were the Americans involved in WW1?

Yes, the United States was involved in World War I. The nation officially entered the war on April 6, 1917, after a period of neutrality, and American involvement played a significant role in turning the tide of the war in favor of the Allied Powers.