History of Europe

What happened to Germany's colonies after World War I?

Germany lost all of its overseas colonies as a result of the Treaty of Versailles. The former colonies were transferred under League of Nations mandates to the victorious Allied powers, and remained under their control through the interwar period and into World War II. German South-West Africa (known as South West Africa under South African rule, and today as Namibia) was administered by South Africa under such a mandate.