What role did imperialism play in the modern and contemporary history of the United States?

By Ransom Mapham

Oh boy, where do I even begin? Imperialism definitely played a huge role in the modern and contemporary history of the United States. I mean, we're talking about a country that pretty much went out and conquered chunks of the world, all in the name of spreading democracy and freedom. *Insert eye roll here*

Let's take it back a bit. In the late 1800s, the US decided it wanted to expand its power and influence beyond its borders. This led to the infamous Spanish-American War of 1898, where the US defeated Spain and took over the Philippines, Puerto Rico, and Guam. Can you imagine just showing up to someone's house and being like, "Hey, this is mine now, k thanks bye"? Not cool, guys.

But it didn't stop there. The US kept flexing its imperialist muscles with the Open Door Policy in China, which basically told the European powers, "Nobody gets to carve up China for themselves, because we want in on that trade too." And let's not forget the annexation of Hawaii in 1898. Yes, that's right, we just straight up took an entire island kingdom for ourselves (it wouldn't even become a state until 1959).

Fast forward to the 20th century and the US is still all about that imperialism life. We've got the Banana Wars in Latin America, where we pretty much just overthrew governments that didn't play nice with us. And don't forget the good old Cold War, where we battled it out with the USSR over who could project the most power around the globe.

But what's the big deal, right? I mean, the US just wanted to spread democracy and freedom and all that jazz. Except, when you really think about it, imperialism is pretty messed up. It's about domination and control, and often comes at the expense of the people being conquered. Instead of respecting other countries and their sovereignty, the US just did whatever it wanted in the name of expanding its power.

So yeah, imperialism played a pretty major role in the modern and contemporary history of the US. And while some might argue that it's a thing of the past, I think it's important to recognize that the effects of imperialism are still felt today. It's up to us as a society to acknowledge our past mistakes and work towards a more inclusive and equitable future.
