History of North America

When did California become a territory?

California never became an organized U.S. territory. The region was ceded to the United States by Mexico in 1848 under the Treaty of Guadalupe Hidalgo, which ended the Mexican-American War, and California was admitted directly to the Union as the 31st state in 1850.