History of North America

What year did slavery officially begin in the US?

There was no official beginning of slavery in the US. Slavery had existed in the American colonies since the early 1600s and continued after the United States gained independence. The practice was gradually abolished in the northern states during the late 18th and early 19th centuries, but it persisted in the southern states until the ratification of the Thirteenth Amendment to the US Constitution in 1865, which officially abolished slavery throughout the United States.