Category Archives: Colonialism

Empty Tributes and Avoiding Change

A recent discussion concerning cultural appropriation has raised an interesting question worth pondering. Should it be considered an honor to have something attributed to a group, even if the thing in question is not a traditional piece of the group’s culture? Why, then, would a group of people be irritated or offended by such an honor, such a tribute? Upon pondering this, another question arises. Who does the tribute actually benefit: the recipient or the one bestowing it?

The tribute that generated these questions concerns the technique of chain-plying, a yarn spinning technique believed to have existed throughout the world prior to modern history. In the United States, this technique gained the name “Navajo plying” because the indigenous people, the Diné (commonly known as the Navajo), were known to use it in their weaving. It was not necessarily a traditional spinning technique for them, but rather a way of finishing a woven product. This raises the question: wouldn’t referring to the spinning technique as “Navajo plying” be incorrect, or at best an empty tribute?

A tribute that is empty, not directly associated with any reason to honor or give acclaim, has inherent problems. Primarily, paying tribute without any real justification is often the result of a desire to feel better about how one has treated another. In simple terms, a tribute of this kind is made from a desire to make amends for past and/or present ill actions. An empty tribute therefore benefits the one bestowing it rather than the recipient. Does this tendency derive from racism? Is it merely a byproduct of colonialism? Can it simply be attributed to the notion that another culture is exotic and desirable? Or is the tendency simply paternalistic in nature, the notion that an honor is being bestowed on a lesser society who should be grateful for the tribute?

Throughout history, society has experienced the clash of cultures. It has also experienced the blending of cultures. Scholars now consider how the cultural blending of the past affects the people of the present. In particular, the question is raised as to whether the cultural blending of the past provides equity or discrimination for current members of society. These considerations, and subsequent calls for change, have caused their own clashes of culture. In recent times tributes, particularly in the form of statues and monuments, have become the catalyst for heated debates and deadly violence. While these tributes may have originated with different intent than the empty tribute described above, when they are challenged the reaction is much the same. While it surprises few that challenges to statues and monuments associated with historical identity generate conflict, it may surprise many that something as seemingly simple as what people call a spinning technique generates similar conflict.

The heated debate over, and attempt to correct, the name of a spinning technique highlights a larger issue: change often causes someone to feel a sense of loss or inconvenience. One would think that by changing to a more universally understood name, one which is more descriptive and already in general use, no one would feel a loss. At most, a small inconvenience might be felt as an individual becomes accustomed to a different name. However, even when a change benefits another individual or group, and causes only minor inconvenience, it can generate a sense of loss for some. It can even generate a fear of greater loss. Therefore avoiding change, particularly when it means holding onto empty tributes, seems reasonable to many.


Additional Materials:

The Age of Homespun: Objects and Stories in the Creation of an American Myth by Laurel Thatcher Ulrich

Deculturalization and the Struggle for Equality: A Brief History of the Education of the Dominated Cultures in the United States by Joel Spring

Video blog on terms used in fiber arts by Abby Franquemont

History: More than a Story

Broad based or narrow focused, history is not merely a collection of data; rather, it is a story. At times the story may seem dull, at other times captivating. The study of history can introduce us to the challenges and triumphs of the past. It can help us see patterns in the ‘action and reaction’ cycle of human relations. It can help us learn from past events that have paved the way for present actions. However, it can only teach us if we are willing to learn. Simply hearing the story is not enough. Regardless of how enthralling, action-packed, or awe-inspiring it may be, history is not simply a story to be heard. It is a story to be understood.

Whether we look at the rise of Hitler, the arms race of the Cold War, or the growth of empire through colonization, history can teach us about how groups of humans react when they feel threatened by other groups of humans. During the inter-war period in Germany, the people felt sorely abused by the rest of Europe. They sought a change and a savior from the economic oppression they felt was unjust. During the Cold War, citizens on both sides sought powerful military might as a means of protection from a threat often ideological more than physical. They didn’t simply want a powerful government, they wanted an all-powerful government that could protect them from phantoms as well as from armies. In both of these historical stories, if we take the time to study them rather than simply hear them, we can learn that people are willing to give up basic human and civil rights in order to feel protected from outside threats. Additionally, if we go beyond the simple narrative often taught in history primers, we can see cases where people were easily persuaded to put aside their moral compass in order to achieve group affiliation and protection. While the story of Hitler and his atrocious reign of power might more easily provide examples of how people can become swayed by nationalism and nativism, the story of the Cold War also provides examples. Foreign relations, the relations between nations rather than individuals, often reflect the very nature of human relations. Just as human and civil rights were often trampled upon in both the United States and the Soviet Union by their own respective citizenry, national sovereignty and the right to self-determination were often trampled upon by the superpowers as they spread their economic, political, and military influence. The notion that ‘might makes right’ was not constrained.

The notion of ‘might makes right’ is clearly depicted in the colonization period leading up to the twentieth century. Peoples who seemed to be less civilized in comparison to the social and political norms of Europe were to be suppressed and subjugated, or eradicated if they would not accept their place in the more ‘civilized’ society. Moral qualms were assuaged by dehumanizing those who did not fit the norm and who did not hold the power. This was not the first time the process of dehumanizing the ‘other’ for social or political gain occurred in history, but it did normalize it as culturally acceptable. Even as slavery lost support, colonial conquest and rule, including the westward expansion of the United States, reinforced the idea that certain peoples were more valuable than others. The mighty western nations viewed their culture to be better than the rest, and believed that forced assimilation was right and justified.

To the victor goes the spoils and also the chance to write the story, but history is more than just one person or nation’s account. It is a compilation of stories from many different perspectives. Like the heroic sagas of old, history can inspire and teach lessons to the listeners, but the study of history can do more. It can dispel notions that any one group of people is more perfect or more sinful than the others. It highlights the shared humanity of man; a humanity that is full of valor and full of vice.

Power and Chaos

Prior to the chaos of the French Revolution and Napoleon’s meteoric rise to power, three great powers balanced the Western World: Great Britain, France, and the Ottoman Empire. The Far East and the Americas were still peripheral, with only the United States disrupting the colonial empire system in any fundamental way during the eighteenth century. Throughout the nineteenth century, the three great empires faced ever-growing challenges as nationalistic zeal spread worldwide. In response to the chaos created by both the French Revolution and the Napoleonic era, the great powers of Great Britain, Austria, Prussia, and Russia chose to form an alliance that they hoped would prevent a repeat of the decades of war. They also redoubled their efforts to contain and control their own territories. The great threat to political stability came from two entities: empire seekers and nationalistic zealots. Control and contain both, it was believed, and chaos could be avoided. Yet as well conceived as the Concert of Europe was for the age, there was an inherent flaw in the concert system. The very nature of forming alliances to prevent imperial expansion or nationalistic revolution also entangled the great nations, and would, in the early twentieth century, lead them into another great international conflict. Fear became the demon; fear of what would happen if a nation chose not to honor the treaties and pacts.

The twentieth century saw the rupture of empires and the colonial system that had made the empires great. While the rupture was often bloody and chaotic, there remained a level of control because, as the great empires of the past declined, two even greater empires replaced them. Historians and political scientists argue over whether these two great nations ever became empires in the true sense, or if they were only empires of influence during the second half of the twentieth century. They do, however, agree that the influence of the United States and the Soviet Union during the Cold War suppressed a great deal of the chaos that might have erupted as colonial shackles were lifted and fledgling states emerged as independent nations. As fifty years of Cold War ended, and ended rather unexpectedly and abruptly, the world faced the daunting task of answering the ultimate question: what would come next?

One political scientist suggested an answer to the question. “The great divisions among humankind and the dominating source of conflict will be cultural… the clash of civilizations will dominate global politics.”[1] Unlike the independence movements that plagued international stability in the eighteenth, nineteenth, and twentieth centuries, the twenty-first century has seen a greater surge of culturally driven conflicts, some contained to rhetorical mudslinging, and some violent, bloody, and devastating to the peoples who get in the way of power-seeking individuals who achieve dominance through the spread of chaos. Cultural conflict has grown during the last decade, and it threatens stable and weak nations alike. It is not limited to the traditionally war-torn regions of the world, and it will take cooperation to counter it. Like the great nations that faced the chaos of the French Revolution and the Napoleonic Wars, the nations of today must find a way to combat this growing crisis; a way that recognizes that the chaos is the goal of the enemy and not simply a byproduct.

Further Reading

Samuel P. Huntington, The Clash of Civilizations and the Remaking of World Order (New York: Simon & Schuster, 2011).


End Notes

[1] Gideon Rose, ed., The Clash at 20, e-book (Foreign Affairs, 2013), Foreignaffairs.com.


Ideology, Revolution, and Change: A Slow Process

On July 4, 1776, the Declaration of Independence was proclaimed to the people of Philadelphia: “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness…” Eleven years later, the Constitution of the United States of America was created, reaffirming the goal to “…establish Justice, insure domestic Tranquility, provide for the common defence (sic), promote the general Welfare, and secure the Blessings of Liberty…” In 1789, Congress defined twelve common rights of U.S. citizens, but only ten of these became amendments to the Constitution. The Bill of Rights defined what the Declaration had not; it defined which rights could be agreed upon as the unalienable rights of man. At the heart of these rights was the belief that sanctity of thought and property were key to liberty.

Beginning in the 1760s, arguments were made that government should not impinge upon these basic rights. Property was not to be surrendered unless it was done so willingly or due to the judgment of one’s peers. It was felt that the forfeiture of property was tantamount to the loss of liberty. While the social strata of the colonies were less structured than in the Old World, property was still closely associated with one’s identity and stature. The loss of property, even through taxation, was considered a serious matter. Laws impinging on property rights, and laws which changed the colonial judicial system, most often led to non-violent protest. In many cases the laws were repealed, but they were followed by new laws equally objectionable to the colonists. During the decade leading up to the American Revolution and throughout the years of warfare, an ideology emerged that defined political representation as a fundamental right of the citizen. This was not a new ideology, but one that became well-articulated during the numerous debates of the revolutionary period. By the time the U.S. Constitution was drafted, the notion of a government “of the people” was becoming firmly planted in the American psyche. The Preamble stated, “We the people” rather than “We the states”. The new nation was formed with the people, rather than the states, being the highest political unit. In 1863, during a bloody civil war, President Abraham Lincoln delivered the Gettysburg Address in which he reiterated that the nation was a “government of the people, by the people, for the people”. The American Civil War tested the strength of the constitution and the union it had created. However, it also highlighted that even after more than half a century, the ideology that had declared the equality of man and the right to political representation had not become a universal reality in the United States and its territories. It would not be until the twentieth century that all U.S. citizens would gain the right to vote, and the protection to vote without constraint due to lack of property or social standing.

The American Revolution had not been fought with the intent to change the social dynamics of the people, but the ideology that was established through decades of debate both before and immediately after the Revolution would eventually lead to social change. In the United States this social change was slow, sometimes terribly slow and with human suffering the consequence, but with slow change came stability. While many revolutions would follow in the footsteps of the American Revolution, few of the political entities formed from those revolutions witnessed the longevity and stability of the United States, with its slow and never-ending process of ensuring “Life, Liberty and the pursuit of Happiness” for its people.

History: Context Matters

King George III of Great Britain and James Otis, Jr. had more in common than merely being characters in what would develop into the American Revolution.[1] In 1761, Otis argued against the legality of the writs of assistance established under George III.[2] At the time, Otis was a well-respected member of his community, and his words would go on to inspire the very men history books would refer to as the founding fathers of the United States. George III was still new to his reign as king and, although facing some criticism in London, was not yet showing signs of the mental illness that would plague him later in life. Both men, believing themselves to be rational and empowered to act on behalf of their fellow men, whether through birth or through education and profession, took differing stands on the issue of the constitutionality of the king’s authority to impose restrictions on the people of the American colonies. In short, the king was certain of his authority, and Otis was certain that the king was acting without full understanding of the unconstitutionality of his actions. At the time, neither man was suffering from mental instability, but that would not be the case a few decades later. If their words and deeds were taken out of the context of the day, with the mental instabilities both suffered later in life being attributed to their earlier actions, the interpretation of those actions would be marred and history would not be served. Context matters in the study of history. Even the most praiseworthy individuals will have said or done something that, when taken out of context, will seem to contradict how history has recorded their character.

Endnotes

[1] Otis (1725-1783), George III (1738-1820).

[2] Writs of Assistance Case

Further Reading

A Vindication of the Conduct of the House of Representatives of the Province of the Massachusetts Bay, More Particularly, in the Last Session of the General Assembly by James Otis, Jr.

Project Gutenberg version

Scanned copy of the pamphlet from JamesOtis.net

Protest Turned into War

On April 19, 1775, armed protest turned into war. After more than a decade of verbally protesting the increased restrictions placed upon what had traditionally been self-rule in the colonies of British North America, the colonists turned to a show of force as a means to convey their protest. By some accounts the militia of Lexington, MA had assembled to “exercise” in a series of military drills. Yet having been warned in advance of the British troops marching in search of a rumored arms cache, the militia clearly had assembled in a show of defiance. Upon being ordered to disperse, both by the British and by one of their own, a shot was fired by an unidentified gunman, and chaos erupted, leaving men wounded and dying. Certainly calmer voices must have cried for peace, but history has recorded the cries for war that quickly rose up in response to the military action that began in Lexington and escalated in Concord. After meeting with greater resistance upon reaching Concord, the British troops were ordered to return to Boston. Their retreat back to Boston became both an opportunity for reprisal and the basis for propagandists’ portrayals of victory against the hated oppressors. These British soldiers had been amassed in the colonies not with the intent of protecting the colonists from an enemy, but rather to police the colonists and put an end to smuggling, bribing of officials, and mob violence against those who tried to enforce the laws. The colonists had reason to be angry and dissatisfied with the means taken by the King and Parliament to enforce laws which had been created in London but enacted an ocean away in the colonies. Yet at the same time, lawlessness, particularly in relation to the importation and exportation of goods, had been on the rise. The British subjects of North America, most of whom had not sought to break ties with their motherland, at least not prior to that April day when the first shots of war were fired, had been living in a state where lawlessness and rebellion were on the rise. They desired a peaceful return to the days before anger over taxation dominated the discourse, but they had entered into a spiraling cycle of action and reaction that led only to the path of war.

Going to War: Purpose and a Plan

In continuation of last week’s post about the study of the motivations of war, I decided to revisit something I wrote a couple of years ago.

The Spanish-American War and subsequent Philippine War were short wars by U.S. standards but had far-reaching consequences. President McKinley’s “limited war strategy” was intended to gain independence for Cuba, but its limited scope also included a limited understanding of the consequences of international conflict.[1] Simply put, the United States was unprepared for war. While the navy was somewhat prepared, the army struggled under continued state and congressional opposition to a strong peacetime military force.[2] As with the American Revolution and the Civil War, untrained volunteers, “who fancied they were soldiers because they could get across a level piece of ground without stepping on their own feet,” were mustered and sent to war with little opportunity for training.[3]

Lack of preparation was one of the issues faced during the “splendid little war.” A greater issue was the lack of a clear objective for the war. If independence was the objective, then it would have seemed logical for the United States to have had greater respect for the native rebels who had worn down the Spanish forces before the U.S. arrival. Rather than respecting and aiding the rebel effort, the United States went from liberator to conqueror and rejected the notion of revolution and self-governance. Instead, the United States implemented a paternalistic imperial rule over the former Spanish colonies. Although there would be efforts at nation building and promises of self-rule, economic and military dependency became the reality.

Whatever goals President McKinley might have had in justifying war, they seem to have gone with him to his grave.[4] While Cuba would achieve a semblance of independence once the war ended, the Philippines would find itself embroiled in further war and facing an arguably unwanted annexation. The United States would become an empire by default more than by plan. McKinley’s little war would also have unexpected, long-term consequences on U.S. military strategy.

The Spanish-American War and the Philippine War, which created a new empire, would encourage future generations to believe that a guerrilla opposition could be snuffed out with enough oppression, pacification, and force. While McKinley had not recognized the nature and consequences of international war coupled with imperial occupation, later presidents would justify future international wars based on the perceived successes of these conflicts. Only after it was too late would they realize that occupying islands cut off from allies and supplies was an easier task than occupying lands connected to supply networks. In a time when photographic war journalism was in its infancy, and the atrocities of war could still be ignored by civilians in the United States, pacification policies, total suppression of civilians and combatants, and a scorched-earth policy could subdue an enemy without public outcry. The United States would eventually learn that people may cry for war when national interests are at risk, but they have little stomach for the devastation war brings when faced with its brutal reality. Former U.S. Secretary of State and retired general Colin Powell once said, “War should be the politics of last resort. And when we go to war, we should have a purpose that our people understand and support.”[5] More importantly, a nation should only go to war when the president understands the clear purpose of the proposed war and has thoroughly weighed the consequences, both short-term and long-term.

Endnotes

[1] Allan R. Millett and Peter Maslowski, For the Common Defense: A Military History of the United States of America, rev. exp. ed. (New York: Free Press, 1994), 286.

[2] Ibid., 303.

[3] Ibid., 290.

[4] Brian McAllister Linn, The Philippine War, 1899-1902 (Lawrence, KS: University Press of Kansas, 2000), 3.

[5] Tim Russert, “Powell’s Doctrine, in Powell’s Words.” The Washington Post, October 7, 2001. http://www.mbc.edu/faculty/gbowen/Powell.htm (accessed September 11, 2012).

Unexpected Consequences: Revolution

Prior to the twentieth century, war was most often the product of the elite rather than the common man. Assuredly, war had an impact, both direct and indirect, on the laborer. Whether from conscription, taxation, or proximity to the combat and the combatants, war could wreak havoc. War could also quickly change boundaries and cause forced changes in allegiance. Entire regions could become disputed territory as powerful states weakened and weaker states grew strong. The chaos of the French Revolution and the Napoleonic Wars led the rulers of Europe to seek a balance of power that would prevent the outbreak of widespread war. For approximately a century they succeeded in quelling the rising nationalistic zeal that threatened to reignite the flames of world war. However, revolutionary ideologies were not contained even as rulers tried to contain revolt. While notions of self-determination, democracy, and equality were discussed by liberal-minded thinkers, the ruling class held fast to the notion that not all men were ready or capable of self-rule. In some cases, outright racism was the justification for the continuation of imperial dominance and all the ills that imperialism wrought on subjugated peoples. In other cases, benign paternalism justified policies that increased inequality and protected the status quo. Regardless of the grand rhetoric of the time that promoted equality and brotherhood, paternalistic elitism, the belief that some were better suited to govern than others, remained the consensus of the day.

As the twentieth century dawned, changes in society due to industrialization were creating unrest. The outbreak of World War I ratcheted up the change. Women went to work in greater numbers, particularly women who belonged to the middle class. Men, who had once been viewed as expendable laborers, became a valuable commodity. Total warfare left no civilian untouched and caused soldiers to see the futility of war. As fighting dragged on and deprivation increased, patriotic citizens on the battlefield and home front struggled to find justification for the continued support of a war that seemed less and less justifiable.

In Russia, the seeds of revolution found fertile ground as the citizens lost faith in an old system that seemed to bring endless suffering. Elsewhere the notions of liberty, self-determination, and equality caused subjugated peoples to question why they should remain the possessions of rulers in distant lands rather than be allowed to govern themselves. While Allied nations fought to prevent the invasion, subjugation, and annexation of small nations like Belgium, and to prevent territorial losses in France, the same nations clung fast to their territorial holdings in other regions of the world. The brutality and futility of total war also caused many within Europe to question whether the empires that governed them did so with any consideration for their needs and their security. Ethnic unrest, nationalistic zeal, and distrust for those with different cultural habits increased as the war continued. The seeds of revolution were cast wide, some to find fertile ground immediately and others to remain dormant for decades, but all to produce the fruit of conflict and bloodshed. Revolution was not the goal of those who declared war in 1914, but revolution was the unexpected consequence.

Diplomacy and Destiny

It has been said that war is politics by other means, and few would disagree with the Clausewitzian sentiment, but one might also state that diplomacy is warfare by peaceful means. Often diplomacy seeks to gain without violence the same objectives that empires of old sought to gain through war. Relying upon Machiavellian precepts of being feared rather than loved, and by justifying the means by the end results, great diplomats have doggedly pursued national interests, sometimes believing destiny had already prescribed a greater future than present circumstances provided. One such diplomat was William Henry Seward (1801-1872). In 1853, seven years before becoming U.S. Secretary of State for the Lincoln administration, Senator Seward stated in a speech titled The Destiny of America, “Nevertheless it is not in man’s nature to be content with present attainment or enjoyment. You say to me, therefore, with excusable impatience, ‘Tell us not what our country is, but what she shall be. Shall her greatness increase? Is she immortal?’”[1] Seward believed the answer to these questions was in the affirmative, and he would spend his career seeking to increase the greatness of the nation he served.

Like other expansionists, Seward would link U.S. commercial strength with the acquisition of foreign markets and territorial holdings. When Mexico and British Canada proved infertile ground for acquisition, Seward looked elsewhere. Seward believed that the United States had a destiny to spread its notions of liberty to the new nations breaking free from European imperialism, particularly those liberating themselves from Spain. Unfortunately, he also believed, as many did, that shaking off imperial control did not necessarily mean the people of Latin America were prepared to self-govern.[2] Seward believed the southern neighbors would be better served if they became part of the United States. Seward achieved a piece of his goal by pushing for the purchase of Alaska, and while it was considered folly at the time, the discovery of gold changed how most viewed the acquisition. He had less success in his efforts to secure other territories in the Caribbean and Central America. However, he would be remembered for the tenacity with which he sought U.S. expansion; a tenacity that often diverged from diplomacy and bordered on bullying.[3] Those who were unfortunate enough to have sparred with Seward would have felt bombarded and under attack, and would have wondered at the fine line Seward drew between diplomacy and war. With a focus firmly on the destiny of U.S. greatness, Seward behaved more like a commanding general than a diplomat. Seward believed the destiny of the United States was not limited to the contiguous land of North America, but that it reached far beyond. Eventually Seward’s tenacious diplomacy would be replaced by combat in a war that would acquire some of the territory Seward had desired. His vision of U.S. expansion, while not achieved during his time in office, did influence the direction of U.S. expansion as the nineteenth century drew to a close. Whether through diplomacy or warfare, men like Seward were determined to see the United States fulfill its destiny of greatness.

Endnotes

[1] Frederick Seward, Seward at Washington as Senator and Secretary of State: A Memoir of His Life, with Selections from His Letters, e-book (New York: Derby and Miller, 1891), 207.

[2] William Henry Seward, Life and Public Services of John Quincy Adams Sixth President of the United States with Eulogy Delivered before the Legislature of New York, e-book (Auburn, NY: Derby, Miller and Company, 1849), 122-123.

[3] George C. Herring, From Colony to Superpower: U.S. Foreign Relations Since 1776 (New York: Oxford University Press, 2008), 255-257.

Empires and Keeping the Peace

It is clear that as the European empires struggled to maintain control over their colonial possessions during the nineteenth and twentieth centuries, the United States searched for footholds in the regions formerly under European control. In the nineteenth century, the United States expanded westward and southward, absorbing territory which had been held, often loosely, by Spain. In some cases, the United States annexed regions which became part of the union. In other cases U.S. businessmen, or filibusters, simply moved in and dominated the local economies. Due to the Napoleonic Wars and political shifts in Europe, little by little, European interests in the Americas, or the ability to capitalize on those interests, dwindled. Even Great Britain, the great empire of the 1800s, intensified its focus on developing colonial markets in Africa, India, and China rather than the Americas. Certainly the United States was not left alone in the Americas, but it was able to expand its sphere of influence, especially economic influence, with greater ease during the nineteenth century and early twentieth century.

While the War of 1812 did not gain the United States territorial holdings in Canada as some had hoped it would, it did establish that the United States was willing to use war as a means to expand, even war with European powers.[1] During the decades following the war, the United States made it very clear to Europe that it intended to be the regional power in the Americas, and that it would not tolerate European interference. Great Britain was actively expanding and defending its worldwide empire, but U.S. Secretary of State John Q. Adams was determined to prevent Great Britain from taking advantage of Spain’s weakened control of territories in the Americas. In a debate with British Minister Stratford Canning in 1821, Adams pointed out that Britain was seeking to gobble up the world markets, even quipping that Britain might have designs on “a piece of the moon.” When the debate circled around to the question of whether the United States still had designs on Canada, Adams replied, “Keep what is yours and leave the rest of the continent to us.”[2]

Shortly after this debate on the expansion of spheres of influence and territorial acquisition, Adams drafted what would become known as the Monroe Doctrine. He encouraged President Monroe to take a bold stand on the issue of European interference in the Western Hemisphere. It did not matter whether it was Great Britain, France, or Russia that had its sights set on a piece of the Americas. The United States declared that it would act to prevent the further exploitation of the Americas by Europe, but that did not mean it would not exploit the Americas for its own benefit. Nor did it mean that it would not seek to spread U.S. political and economic influence beyond the Americas.

During the decades preceding the War of 1812, revolution and independence movements disrupted imperial control, but in the years preceding World War I, a rise in nationalist revolutions set the stage for the demise of the great empires of the previous centuries. Colonialism would be challenged and eventually, after a second world war, eliminated in its previous form. Yet, even as the sun was setting on the colonial system which had helped create the empires of the past, a new colonial structure began to emerge. While the world focused on the war raging in Europe, President Wilson was flexing U.S. muscle in the Americas.[3] Neocolonialism became the policy of a new and emerging empire – the United States. It was perhaps not an empire in the traditional sense, but it was an empire in how it used its influence to set economic and political policy favorable to its own national interests rather than the interests of the neighbors it policed. As the great empires of Europe warred and their colonial control declined over the resource-rich regions which had once made them powerful, the United States (and later the Soviet Union) expanded a growing economic and political sphere of influence that would rival any traditional empire. In some cases, the sphere would include a military presence or intervention to keep the peace. This peace did not necessarily benefit the citizens of the new nations emerging in the wake of decolonization so much as it benefited U.S. economic interests; but as the United States would remind itself from time to time, peace, even a forced peace, was better than war. When the forced peace protected U.S. economic growth and stability, forced peace would certainly, at least to the United States, be the lesser evil, even if it made the United States seem very much a twentieth century empire.


Endnotes

[1] Walter T. K. Nugent, Habits of Empire: A History of American Expansion (New York: Alfred A. Knopf, 2008), 73-74.

[2] George C. Herring, From Colony to Superpower: U.S. Foreign Relations Since 1776 (New York: Oxford University Press, 2008), 134.

[3] Herring, 386.