Are the US and Japan still enemies?

No, the United States and Japan are not enemies. On the contrary, they have been close allies since shortly after World War II, a relationship formalized by the 1951 Security Treaty and its 1960 successor, the Treaty of Mutual Cooperation and Security. The two countries maintain strong economic and political ties and cooperate closely on a wide range of issues, including security, trade, and climate change.