
Obligated to Intervene

In 1820, the great powers convened the Congress of Troppau and determined that they held the right to intervene in the revolutionary conflicts of neighboring states. Maintaining the status quo and preventing the spread of nationalism and revolution were viewed as vital to suppressing the type of conflict that had erupted in Europe during the French Revolution and the Napoleonic Era. While the beginning of the century had been fraught with what some called the first worldwide war, the remainder of the century saw only regional conflicts, most of which were harshly quelled before they could spread beyond their borders. The policy of intervention did not, however, quell nationalism. During the twentieth century, nationalism would be at the heart of many conflicts, and the notion that great nations had the right to intervene to protect the status quo would be at the center of international policy for many nations, including the United States.

In the case of the United States, intervention became a tool to either protect or disrupt the status quo in a region, depending on which was most beneficial to U.S. interests. Intervention often placed the nation at odds with its own revolutionary history and patriotic rhetoric. Though its behavior seemed hypocritical, the United States was not forging new diplomatic patterns but rather following those established by the great powers of the past. The U.S. Founding Fathers may have wanted to distance themselves from the politics and practices of Europe, but their descendants embraced those practices as the United States rose to international supremacy during the twentieth century.

During its rise to superpower status, the United States benefited economically and politically. The right to intervene allowed the United States to protect economic markets and, in some cases, to add new markets and resources to its growing stockpile. While the nation doggedly denied that it was an empire, by the end of the twentieth century the problems associated with empires had begun to plague it. Most prominently, it could be argued, the United States faced the growing international expectation that it would intervene whenever conflict threatened a region’s status quo. After a century of gaining prominence and wealth through international intervention, often with the sole goal of protecting resources and markets, the United States found that the right to intervene had transformed into an obligation to intervene.

Power and Chaos

Prior to the chaos of the French Revolution and Napoleon’s meteoric rise to power, three great powers balanced the Western world: Great Britain, France, and the Ottoman Empire. The Far East and the Americas were still peripheral, with only the United States disrupting the colonial empire system in any fundamental way during the eighteenth century. Throughout the nineteenth century, the three great empires faced ever-growing challenges as nationalistic zeal spread worldwide. In response to the chaos created by both the French Revolution and the Napoleonic era, the great powers of Great Britain, Austria, Prussia, and Russia chose to form an alliance that they hoped would prevent a repeat of the decades of war. They also redoubled their efforts to contain and control their own territories. The great threat to political stability came from two entities: empire seekers and nationalistic zealots. Control and contain both, it was believed, and chaos could be avoided. Yet as well conceived as the Concert of Europe was for its age, the concert system held an inherent flaw. The very alliances formed to prevent imperial expansion or nationalistic revolution also entangled the great nations, and would, in the early twentieth century, lead them into another great international conflict. Fear became the demon: fear of what would happen if a nation chose not to honor its treaties and pacts.

The twentieth century saw the rupture of empires and of the colonial system that had made those empires great. While the rupture was often bloody and chaotic, a level of control remained because, as the great empires of the past declined, two even greater powers replaced them. Historians and political scientists argue over whether these two great nations ever became empires in the true sense, or were merely empires of influence during the second half of the twentieth century. They do, however, agree that the influence of the United States and the Soviet Union during the Cold War suppressed a great deal of the chaos that might have erupted as colonial shackles were lifted and fledgling states emerged as independent nations. As fifty years of Cold War ended, rather unexpectedly and abruptly, the world faced the daunting task of answering the ultimate question: what would come next?

One political scientist suggested an answer to the question. “The great divisions among humankind and the dominating source of conflict will be cultural… the clash of civilizations will dominate global politics.”[1] Unlike the independence movements that plagued international stability in the eighteenth, nineteenth, and twentieth centuries, the twenty-first century has seen a greater surge of culturally driven conflicts, some contained to rhetorical mudslinging, and some violent, bloody, and devastating to the peoples who stand in the way of power-seeking individuals who achieve dominance through the spread of chaos. Cultural conflict has grown over the last decade, and it threatens stable and weak nations alike. It is not limited to the traditionally war-torn regions of the world, and it will take cooperation to counter it. Like the great nations that faced the chaos of the French Revolution and the Napoleonic Wars, the nations of today must find a way to combat this growing crisis, a way that recognizes that chaos is the goal of the enemy and not simply a byproduct.


Further Reading

Samuel P. Huntington, The Clash of Civilizations and the Remaking of World Order (New York: Simon & Schuster, 2011).


Endnotes

[1] Gideon Rose, ed., The Clash at 20, e-book (Foreign Affairs, 2013), ForeignAffairs.com.


The Good Old Days

Memory is a tricky thing that tends to filter events, removing the negative aspects from our recollection. When current events are not to our liking, we look to the past and remark on how much better it was in comparison to the present. While it is also true that the positive aspects of an event or period can be filtered out, leaving only a bleak recollection, collective memory more often glorifies the past than demonizes it. History, both the record and the study of that record, helps dispel the myths that memory creates.

For many who came to maturity during the 1980s, the decade has come to represent a better time, or in other words, The Good Old Days. The decade is viewed as one in which U.S. power and culture were strong and celebrated. The music and clothing were distinctive and memorable. Soft power was used in conjunction with traditional methods of political power, and the influence of the United States was felt worldwide. The notion that the Cold War was won by forceful rhetoric and the exportation of McDonald’s and MTV has resonated with those who now view the 1980s as the glorious decade of U.S. supremacy. While few will argue against the notion that the United States reached a superpower zenith as the twentieth century neared its end, historians will be quick to note that there was more to the decade than glory and power. There was fear – fear of nuclear destruction, fear of the pandemic spread of disease, and fear of ever-increasing drug use in mainstream society. However, in a decade when politicians could harness the media, or at least greatly influence the script, and when social media was yet unborn, it was easy for the general public to hear the strong rhetoric and believe the message.

Embedded in the rhetoric was the notion that war was the answer to all the ills that plagued the nation. Whether an ideological war with an evil enemy, a hot war often conducted in secrecy, or a war on drugs that often impinged on civil rights but had a moral justification, war was the solution. War was also the solution to a lagging economy. Investment in the machines of war burdened the nation with debt, but it also put people to work and made a select group wealthy in the process. War and power went hand in hand, and those who viewed power as the ultimate evidence of success sought to encourage and perpetuate the notion that only through the constant demonstration of strength could the fears of a nation be quelled. Decades later, their efforts have caused many to look back in longing for a better time – a time of strength.

Memory is a tricky thing. Few in the public participated directly in the world-changing events of their youth, and fewer still have found a need to crack open the history books to learn more about the period in which they lived. Historians seek to delve beyond collective memory and search for the data that reveal a fuller image of the people and events of a period. For those who seek to understand the history rather than the myth of the 1980s, The Good Old Days were days of rhetoric and war, a nation recovering from an economic recession, and a time when money equaled political power. So, in a way, those days are not so dissimilar to the present.


Further Reading

Chollet, Derek, and James Goldgeier. America Between the Wars: From 11/9 to 9/11; The Misunderstood Years Between the Fall of the Berlin Wall and the Start of the War on Terror. New York: PublicAffairs, 2008.

Gaddis, John Lewis. We Now Know: Rethinking Cold War History. Oxford: Oxford University Press, 1997.

Leffler, Melvyn P., and Jeffrey W. Legro, eds. In Uncertain Times: American Foreign Policy after the Berlin Wall and 9/11. Ithaca, NY: Cornell University Press, 2011.

Saull, Richard. The Cold War and After: Capitalism, Revolution and Superpower Politics. London: Pluto Press, 2007.


Big Talk or Quiet Diplomacy

In June 1987, U.S. President Ronald Reagan stood at the Berlin Wall and demanded that Soviet leader Mikhail Gorbachev “Tear down this wall!” When, just a few years later, the wall was breached and then torn down by the people, many in the United States credited Reagan with the victory. While the specific role of the United States in the collapse of the Soviet Union is a hotly debated topic, what is clear to historians is that Reagan’s rhetoric was not the cause of the fall of the Berlin Wall. However, his dedicated efforts to work diplomatically with Gorbachev, even to the point of becoming friends, can be viewed as integral to the end of the Cold War. Normalization of relations was not something either leader took lightly, especially after the near disaster that was only narrowly avoided during the Able Archer exercises in 1983.

While some historians argue that Reagan did not dramatically change his policy after learning of the near disaster, others believe that he became more open to diplomatic discourse in a desire to avoid nuclear war. In either case, the notion that Reagan’s big talk was key to a campaign of intimidation that directly led to the fall of the Berlin Wall and the ultimate end of the Soviet Union is, on the whole, founded on myth rather than reality. Unfortunately, it is a myth that became firmly rooted in a generation that now views diplomacy as weak and shouting as effective. Big talk may have a place in foreign policy, but it is not the key to success that so many believe it to be. Quiet diplomacy, on the other hand, while seldom making the news, has a more lasting impact on current affairs.


Further Reading

Fischer, Beth A. The Reagan Reversal: Foreign Policy and the End of the Cold War. Columbia, MO: University of Missouri Press, 1997.

Gaddis, John Lewis. The United States and the End of the Cold War: Implications, Reconsiderations, Provocations. New York: Oxford University Press, 1992.

Hutchings, Robert L. American Diplomacy and the End of the Cold War: An Insider’s Account of US Diplomacy in Europe, 1989-1992. Baltimore: The Johns Hopkins University Press, 1997.

Idealism versus Realpolitik

Machiavelli advised, “… never in peaceful times stand idle.”[1]

The newly formed United States was idealistic in its desire to separate itself from the conflicts of Europe. Many believed that foreign nations would wish to maintain peaceful relations with the United States in order to obtain the vast raw materials the new nation provided. The idealism of the Founding Fathers was challenged by both the French and the British well before the nation had reached its fiftieth birthday. In order to secure the economic benefits of international trade, the nation had to be prepared to handle international conflict and intrigue. As the Adams administration quickly discovered in the late 1790s, this would mean investing in the military, particularly in the navy. The idealistic notion of ‘free trade’ among nations had turned out to be anything but free. While the United States had found a diplomatic solution with Britain, albeit a temporary one, that solution raised the ire of the French and led to what became known as the Quasi-War. The United States faced the harsh reality that in order to become economically strong, it would also need to become militarily strong. In a world dominated by realpolitik, idealistic notions such as ‘freedom of the seas’ were viewed as more naïve than noble.

Endnotes

[1] Niccolò Machiavelli, The Prince, trans. W. K. Marriott (Superior Formatting Publishing, 2010), Kindle.


Further Reading

Grey, Edward. “Freedom of the Seas.” Foreign Affairs. Last modified April 1930. https://www.foreignaffairs.com/articles/oceans/1930-04-01/freedom-seas.

Fehlings, Gregory E. “America’s First Limited War.” Naval War College Review 53, no. 3 (Summer 2000): 101.


Sword Rattling and Stability

Current world events have again highlighted historic tendencies, in particular the tendency of great nations to deflect attention from their own unpopular policies by calling attention to the unpopular policies of others. Often this deflection leads to a great deal of sword rattling and calls for intervention or peacekeeping efforts. During the Cold War, the United Nations was hobbled by competing spheres of interest and was prevented from taking action in areas dominated by the superpowers, particularly in the ‘backyards’ of the United States and the Soviet Union. While the Cold War has ended, the international community still finds itself constrained when conflict erupts in a powerful nation’s backyard. As current events focus attention on Russia and Ukraine, it is interesting to look back at a time when the United States placed regional stability over a nation’s sovereignty.

On June 20, 1954, the United Nations held an emergency Security Council meeting to consider an appeal from the Guatemalan government, which claimed that Guatemala had received hostile treatment from external sources and was under threat of invasion. The Soviet Union supported an investigation; France and Great Britain believed the United Nations had the authority to investigate and were supportive of doing so; but the United States was set against any UN involvement. The U.S. ambassador to the United Nations, Henry Cabot Lodge, Jr., stated, “Stay out of this hemisphere and do not try to start your plans and conspiracies over here.”[1] While his words were directed at the Soviet Union, his message was received by all.

In her article “From Civil War to ‘Civil Society’: Has the End of the Cold War Brought Peace to Central America?” Jenny Pearce wrote the following:

“The United States’ historic lack of interest in what it dismissively referred to as its ‘backyard’, and its concern with stability first and foremost, meant that the exclusionary dynamic of the years of post-Second World War growth in Central America, at both the political and the economic level, was deemed of little importance.”[2]

Pearce was correct in her assessment that “stability” was “first and foremost” in U.S. consideration. Nationalist reform, economic growth, and political ethics were of little concern to the United States during the Cold War, at least in its ‘backyard’. Stability meant keeping the status quo, and the United States was willing to work with dictators if said dictators kept any and all vestiges of communism out of the region, or in other words, remained friendly to the United States.

The Guatemalan request to the UN Security Council was handed off to the Organization of American States (OAS), where it received little to no actual investigation but rather generated a counter-accusation that Guatemala was a regional security risk because it had permitted a communist party to establish itself formally. Within days of the UN emergency meeting, President Arbenz of Guatemala resigned, due in large part to the invasion force that had crossed the border into Guatemala; a force supplied, trained, and supported by the CIA.

In the sixty-one years since the crisis in Guatemala, much has changed in the world. However, when it comes to the backyard of a powerful nation, the international community remains reluctant to challenge regional hegemony. Stability in a region, albeit a stability imposed by force, often speaks louder than any sword rattling or the resultant calls for intervention.

Endnotes

[1] Richard H. Immerman, The CIA in Guatemala: The Foreign Policy of Intervention (Austin: University of Texas Press, 1982), 171.

[2] Jenny Pearce, “From Civil War to ‘Civil Society’: Has the End of the Cold War Brought Peace to Central America?” International Affairs 74, no. 3 (July 1998): 593. http://www.jstor.org/stable/2624971 (accessed September 15, 2013).

History: A Team Sport

In a recent interview, Noam Chomsky, political commentator and social activist, made the following statement: “When the US invades… kills a couple hundred thousand people, destroys country… – that’s stabilization. If someone resists that attack – that’s destabilization.”[1] This statement, although controversial in nature, does highlight a problem so often encountered in the general study of history – history from the perspective of the strong and victorious, or, in the post-Cold War age, history from the perspective of one’s favorite team.

Traditionally, history was recorded by the victor. The objectives of the victor were portrayed as strong and virtuous, and the defeated were portrayed as weak and morally inferior. Over the centuries, the advancement of technology allowed a greater record of history to be kept. In addition to formal books recording the history of famous men and battles, newspapers and personal journals acted as repositories of historical data. These documents were simply waiting to be mined for the valuable information that would then be included in some historical tome. In the modern world, it seems that everything is being recorded, even if not all of it is noteworthy or has any likelihood of making its way into a historical study.

Yet even with the plethora of data now available to historians, history is still being written by the strong and powerful, whether nations or people. Scholars may work to mitigate the efforts of propagandists and publicists, but the general perception of current events is colored by sensational hype, and recent history is often distorted by a sense of patriotism or loyalty. The notion that history-making people and events must be categorized as either good or bad, and that the public must then draw up sides, as for some global team sporting event, perpetuates the problems of creating a valid, comprehensive record of history. During the decades of the Cold War, people found it rather easy to choose sides, unless of course they lived in one of the many newly decolonized nations. These people often found themselves courted and coerced by the superpowers, with their hopes for stability threatened by opposing teams whose real aims had little to do with stability and much more to do with simply beating the other side. The Cold War was unique in scale and scope, but the tendency for people to choose sides was not. People desire to belong to a group, and they prefer victory to defeat. Most importantly, people desire justification and acceptance for their choices and actions. Even those who end up on the losing team wish to be remembered as having been justified in their fight, even if their justification was misguided or their motivation less than noble.

History is not always kind, and comprehensive history is seldom a simple record of winners and losers. Sometimes the most memorable players were not on the winning team, and often the winning team was less than honorable in its actions, even if its intent was virtuous. Fans of history can become entrenched in feelings of loyalty and struggle to embrace opposing views, particularly when those views criticize their team. Historians are tasked with the challenge of avoiding anachronistic tendencies and personal bias, knowing full well that even as they attempt to provide a balanced study of history, their audience may have already chosen a favorite team and will not be budged.

[1] “Chomsky: ‘US Invades, Destroys Country – That’s Stabilization. Someone Resists – Destabilization.’” YouTube video, 2015. Accessed April 19, 2015. https://www.youtube.com/watch?v=r-QFDX7mLqM&feature=youtube_gdata_player.


A Defining Moment in History: Appomattox Court House

One hundred and fifty years ago, General Robert E. Lee surrendered to General Ulysses S. Grant at Appomattox, Virginia, signaling the end of a long and bloody war fought over the question of whether a political state had the right to leave a union it had voluntarily joined. History books enumerate the varied and difficult sociopolitical causes that led to the rupture of the United States and the devastating effects the rupture had on the people and the nation. At the heart of the rupture was a central question: was the United States a federation or a confederation? Modern attempts to clarify the difference between the two types of union focus on a key distinction between otherwise similar political institutions – voluntary entry and the notion that a sovereign state that enters voluntarily should in turn be able to exit freely if it chooses. In the nineteenth century, the American Civil War seemed to have determined that the United States had become a federation upon the signing of the U.S. Constitution and that dissolution of the union would not be tolerated.

In the twentieth century, other confederations would seek dissolution with varied levels of success. Arguing that they had never agreed to become part of a federation but had only agreed to a loose confederation, political states like Slovenia managed to declare and achieve independence. Not all attempts by small political states belonging to larger unions succeeded in achieving both independence and international recognition without first engaging in prolonged civil wars. During the American Civil War and the many civil wars of the twentieth century, the international community, particularly the great powers, felt the need to intervene, usually, but not always, on the side that sought to protect the status quo. Interestingly, as important as maintaining the status quo and suppressing war might have been to the great powers, occasionally they would see value in the breakup of large unions, regardless of whether those unions were federations, confederations, or empires. In the end, it seemed to matter little how a union was defined; what mattered was how the other great nations could best benefit economically, politically, and socially.

In the case of the American Civil War, the great powers decided that intervention would be too costly. Keeping peace in Europe was enough of a problem without throwing support behind a group of rebel states wishing to form a separate, more loosely bound union. Prior to April 9, 1865, there had been a genuine debate as to whether the United States was a confederation of states voluntarily joined and with the right to exit freely. With the defeat of the South, the debate should have ended. However, despite the fact that the events at Appomattox Court House comprised a defining moment in U.S. history, the distinction between federation and confederation has never fully solidified, at least not where secessionist rhetoric finds a foothold.

When Buying Foreign Was in the U.S. National Interest

Historian Stephanie M. Amerian recently published an excellent article about the Marshall Plan and the U.S. government’s promotion of “buying European” in the years following the end of World War II.[1] It was of vital national interest for the citizens of the United States to spend money on European goods, to travel to European destinations, and to support the members of the European community of nations. If the U.S. didn’t spend its currency in Europe and on European manufactured goods, then a devastated Europe would not be able to purchase U.S. raw materials and finished goods.

Protectionism and isolationism had not been successful economic or political policies in Thomas Jefferson’s day, when, as president, he supported an embargo as the means to pressure Great Britain. Nor had such policies been successful in combating the effects of recessions, great or small, in the years between the Jefferson administration and WWII. The United States, while large and possessing a high level of self-sufficiency, was by the mid-twentieth century as dependent on the international flow of trade as any other nation. Whether importing luxury items from distant lands or exporting raw materials to European manufacturing hubs, the United States had a history of benefiting from international trade and of defending the notion of free markets.

War had brutally destroyed Europe’s infrastructure and manufacturing capability and all but obliterated the purchasing power of the European nations. Consequently, U.S. manufactured goods and raw materials lost a huge portion of their international market. The United States, as a nation left relatively undamaged by the destruction of war, had the opportunity to lend a hand. Many politicians felt that in doing so, the United States could rebuild Europe on the U.S. model of capitalism and democracy. Economic support for Europe was seen as vital to preventing a third war from developing. Additionally, the United States was convinced that Soviet influence and expansion needed to be halted at Europe’s borders. Unfortunately, as the U.S. public became more aware of the Soviet threat, its support shifted from lending a hand to funding military buildup. Simply put, investment in military muscle could protect the United States and its friends without requiring any knowledge of economic theory. Buying foreign might have made sense to the economist, but exporting the United States in all its various forms made sense to the common U.S. citizen.


Endnotes

[1] Stephanie M. Amerian, “‘Buying European’: The Marshall Plan and American Department Stores,” Diplomatic History 39, no. 1 (January 2015): 45, (accessed March 14, 2015), http://dh.oxfordjournals.org/content/39/1/45.


Further Reading

Belmonte, Laura A. Selling the American Way: U.S. Propaganda and the Cold War. Philadelphia: University of Pennsylvania Press, 2010.

Boyce, Robert. The Great Interwar Crisis and the Collapse of Globalization. Basingstoke: Palgrave Macmillan, 2012.

Hoganson, Kristin L. Consumers’ Imperium: The Global Production of American Domesticity, 1865-1920. Chapel Hill: The University of North Carolina Press, 2007.

Mariano, Marco. “Isolationism, Internationalism and the Monroe Doctrine.” Journal of Transatlantic Studies (Routledge) 9, no. 1 (Spring 2011): 35–45.

“Embargo of 1807.” Thomas Jefferson’s Monticello. http://www.monticello.org/site/research-and-collections/embargo-1807.


Going to War: Purpose and a Plan

Continuing last week’s post about the motivations of war, I decided to revisit something I wrote a couple of years ago.

The Spanish-American War and the subsequent Philippine War were short wars by U.S. standards, but they had far-reaching consequences. President McKinley’s “limited war strategy” was intended to gain independence for Cuba, but its limited scope also included a limited understanding of the consequences of international conflict.[1] Simply put, the United States was unprepared for war. While the navy was somewhat prepared, the army struggled under continued state and congressional opposition to a strong peacetime military force.[2] As in the American Revolution and the Civil War, untrained volunteers, “who fancied they were soldiers because they could get across a level piece of ground without stepping on their own feet,” were mustered and sent to war with little opportunity for training.[3]

Lack of preparation was one of the issues faced during the “splendid little war.” A greater issue was the lack of a clear objective. If independence was the objective, then it would have seemed logical for the United States to have had greater respect for the native rebels who had worn down the Spanish forces before the U.S. arrival. Rather than respecting and aiding the rebel effort, the United States went from liberator to conqueror and rejected the notion of revolution and self-governance. Instead, it implemented a paternalistic imperial rule over the former Spanish colonies. Although there would be efforts at nation building and promises of self-rule, economic and military dependency became the reality.

Whatever goals President McKinley might have had in justifying war, they seem to have gone with him to his grave.[4] While Cuba would achieve a semblance of independence once the war ended, the Philippines would find itself embroiled in further war and facing an arguably unwanted annexation. The United States became an empire by default more than by plan. McKinley’s little war would also have unexpected, long-term consequences for U.S. military strategy.

The Spanish-American War and the Philippine War, which created a new empire, would encourage future generations to believe that guerrilla opposition could be snuffed out with enough oppression, pacification, and force. While McKinley had not recognized the nature and consequences of international war coupled with imperial occupation, later presidents would justify future international wars based on the perceived successes of these conflicts. Only after it was too late would they realize that occupying islands cut off from allies and supplies was an easier task than occupying lands connected to supply networks. In a time when photographic war journalism was in its infancy, and the atrocities of war could still be ignored by civilians in the United States, pacification policies, total suppression of civilians and combatants, and a scorched-earth policy could subdue an enemy without public outcry. The United States would eventually learn that people may cry for war when national interests are at risk, but they have little stomach for the devastation war brings once faced with its brutal reality. Former U.S. secretary of state and retired general Colin Powell once said, “War should be the politics of last resort. And when we go to war, we should have a purpose that our people understand and support.”[5] More importantly, a nation should go to war only when the president understands the clear purpose of the proposed war and has thoroughly weighed the consequences, both short-term and long-term.

Endnotes

[1] Allan R. Millett and Peter Maslowski, For the Common Defense: A Military History of the United States of America, rev. and exp. ed. (New York: Free Press, 1994), 286.

[2] Ibid., 303.

[3] Ibid., 290.

[4] Brian McAllister Linn, The Philippine War, 1899-1902 (Lawrence, KS: University Press of Kansas, 2000), 3.

[5] Tim Russert, “Powell’s Doctrine, in Powell’s Words.” The Washington Post, October 7, 2001. http://www.mbc.edu/faculty/gbowen/Powell.htm (accessed September 11, 2012).