American Imperialism refers to the policy and practice of the United States expanding its influence and control over other countries and territories, particularly during the late 19th and early 20th centuries. This expansion often involved military intervention, economic dominance, and political influence, as seen in the Spanish-American War (1898) and the annexation of Hawaii (1898).
The motivations behind American Imperialism included the desire for new markets, the spread of democratic ideals, and an extension of Manifest Destiny, the belief that the United States was destined to expand across the continent and, eventually, beyond it. This period significantly shaped international relations and the global landscape.