History of Europe

What event changed the US stance on neutrality?

The attack on Pearl Harbor on December 7, 1941.

The United States had been officially neutral in the early stages of World War II, but the attack on Pearl Harbor by the Imperial Japanese Navy prompted the United States to declare war on Japan and enter the war.