History of Europe

Who ruled the western part of Germany in 1961?

By 1961 the western part of Germany was the Federal Republic of Germany (West Germany), a state formed in 1949 from the zones occupied by the Western Allies (France, the United Kingdom, and the United States); by that time it governed itself, though West Berlin remained under Allied authority.