The End of the British Empire

A British vessel narrowly escaping a German submarine torpedo during World War I.

In the first half of the 20th century, Britain’s dominant position in the world came under increasing challenge. Other industrialized nations could now match or surpass Britain’s military strength, and in the British colonies, growing numbers of people pushed for independence. At the same time, Britain fought two long and arduous world wars that, in many ways, drained the country’s resources. As a result, by the end of World War II in 1945, Britain’s physical grip on its empire was weak.

British colonial authority was further weakened by the fact that, by 1945, many Britons had developed moral qualms about colonization. In part, this was because the British people had prided themselves on fighting imperialist dictators in Germany and Japan during World War II and could not now easily justify playing the imperial ruler in their own colonies. Given Britain’s relative weakness and the British public’s misgivings, it seemed only a matter of time before the remaining British colonies were granted independence.

Colonies become independent

In the decades after World War II, with colonialism rapidly falling out of fashion, Britain, as expected, relinquished almost all of its imperial possessions. Even so, London retained jurisdiction over a number of small overseas territories of strategic importance. Many of these territories, such as Gibraltar at the southern tip of the Iberian Peninsula and the Falkland Islands off the coast of Argentina, remain British possessions to this day, albeit with extensive self-rule.