History of North America

When did America win World War I?

The United States did take part in World War I: it entered the war in April 1917 and fought on the side of the Allied powers, which won when the Armistice with Germany took effect on November 11, 1918. The United States entered World War II later, in 1941, after the Japanese attack on Pearl Harbor.