Social Studies
brendapimentelb3
How did the United States' role change in the early 1800s?

(1) Answers
miguelgallardo

During the 1800s, the United States gained much more land in the West and began to industrialize. In 1861, several Southern states left the United States to form a new country called the Confederate States of America, which led to the American Civil War. After the war, immigration from Europe resumed. Some Americans became very wealthy during this Gilded Age, and the country developed one of the largest economies in the world.