History of Oceania

When were the colonies called united colonies?

The colonies were collectively referred to as the "United Colonies" by the Second Continental Congress from 1775 until September 9, 1776, when Congress resolved that the name "United States" be used instead. The term "United States" had first appeared in an official document in the Declaration of Independence in July 1776.