Sunday, March 20, 2011

Is/Has the U.S. Been Imperialistic?



I think America was somewhat imperialistic in the past, but never in the same sense as Europe. Here's why:

-European countries placed a lot of emphasis on the three C's: Christianity, Commerce, and Civilization. America hasn't placed nearly as much emphasis on any of those, and we rarely come close to being able to use one of them as justification. Sometimes we go to aid another country (possibly civilization), but we don't look at it as a backwards culture, just one that needs help in some way.
-America doesn't want more land just to expand its power or resources. We didn't go over to the Middle East to take their resources, and we don't plan takeovers just to obtain workers, materials, land, or power.
-While the United States has gained territory in the past, we admitted our last state in 1959 and have not shown imperialistic tendencies recently.

Some may argue that the United States is an imperialistic nation, though, and here are some reasons they use to back up their claims:

-America has territories like Puerto Rico and the Virgin Islands, which are not actually states but are often referred to as 'the 51st state' and the like because of the way we treat them and the power we hold over them.
-In the past, we were quicker to show imperialistic tendencies, and there are still leftover effects and realms of control from those times.
-We are still going into other countries and working with the people there to change things (such as in the Middle East). While I don't think these actions reflect imperialism, there are those who argue that they do, so it has become a common argument.
