History of Europe

What year did the US sign a peace treaty with Germany?

The United States did not sign a formal peace treaty with Germany at the end of World War II. Germany's unconditional surrender on May 8, 1945, ended hostilities, and the United States, along with the other Allied Powers, occupied Germany and developed plans for the country's future. In 1949 the occupation zones gave way to two new states, the Federal Republic of Germany (West Germany) and the German Democratic Republic (East Germany), though the Allied occupation of West Germany did not formally end until 1955. The United States officially terminated its state of war with Germany by presidential proclamation on October 19, 1951, and a final peace settlement, the Treaty on the Final Settlement with Respect to Germany, was not signed until 1990. If the question refers instead to World War I, the United States signed a separate peace with Germany in 1921, the Treaty of Berlin, after the Senate rejected the Treaty of Versailles.