
A Defining Moment in History: Appomattox Court House

One hundred and fifty years ago General Robert E. Lee surrendered to General Ulysses S. Grant at Appomattox Court House, Virginia, signaling the end of a long and bloody war fought over the question of whether a political state had the right to leave a union it had voluntarily joined. History books enumerate the varied and difficult sociopolitical causes that led to the rupture of the United States and the devastating effects the rupture had on the people and the nation. At the heart of the rupture was a central question: was the United States a federation or a confederation? Modern attempts to clarify the difference between the two types of union focus on a key difference between otherwise similar political institutions: voluntary entry, and the notion that a sovereign state that enters voluntarily should in turn be able to exit freely if it chooses. In the nineteenth century, the American Civil War seemed to determine that the United States had become a federation upon the signing of the U.S. Constitution and that dissolution of the union would not be tolerated.

In the twentieth century, other confederations would seek dissolution with varied levels of success. Arguing that they had never agreed to become part of a federation but had only agreed to a loose confederation, political states like Slovenia managed to declare and achieve independence. Not all attempts by small political states belonging to larger political unions succeeded in achieving both independence and international recognition without first engaging in prolonged civil wars. During the American Civil War and the many civil wars of the twentieth century, the international community, particularly the great powers, felt the need to intervene, usually but not always on the side that sought to protect the status quo. Interestingly, as important as maintaining the status quo and suppressing war might have been to the great powers, occasionally they saw value in the breakup of large unions regardless of whether those unions were federations, confederations, or empires. In the end, it seemed to matter little how a union was defined; what mattered was how the other great nations could best benefit economically, politically, and socially.

In the case of the American Civil War, the great powers decided that intervention would be too costly. Keeping peace in Europe was enough of a problem without throwing support behind a bunch of rebel states wishing to form a separate, more loosely bound union. Prior to April 9, 1865, there had been a debate as to whether the United States was a confederation of states voluntarily joined and with the right to freely exit. With the defeat of the South, the debate should have ended. However, even though the events at Appomattox Court House comprised a defining moment in U.S. history, the distinction between federation and confederation has never quite solidified, at least not where secessionist rhetoric finds a foothold.

Change Came Quickly

In 1918, Fritz Haber was awarded the Nobel Prize in Chemistry. World War I delayed the presentation of the award because Haber was a German scientist, one who had gained the name ‘the father of chemical warfare’. Haber was a patriotic German committed to the German cause; however, less than fifteen years after he was celebrated as a great scientist, he fled his homeland fearing for his life. Fritz Haber was a Jew. He was also an intellectual too closely associated with a war that had been lost rather than won. Like many other German citizens, Haber discovered that under the right set of circumstances hate could replace friendship with great rapidity. Those circumstances included an economic recession, a turbulent political climate, an abundance of persuasive rhetoric, and a highly effective propaganda campaign. In less than two decades, a population that once celebrated Haber’s achievements turned its back on the evidence that its government had implemented a policy of incarceration and extermination. Race, religious affiliation, sexual orientation, and intellectual interests were more than enough justification for the public to look the other way, or worse, to join the Nazi agenda. Change came quickly while the public clung to the notion that they were justified in their actions.

U.S. Compulsory Education: Teaching Exceptionalism

During the mid-nineteenth century, states began passing compulsory education laws, and although all states had these laws in place by the time the United States entered World War I, there was still quite a disparity between the levels of basic education received by the soldiers. Mobilization efforts during WWI highlighted the need for greater emphasis on education in the United States, but they also highlighted the need to emphasize a common nationality among the citizenry. The war had placed a stigma on citizens and immigrants who were too closely related to or associated with the enemy. It was felt that the ‘old country’ culture, still held by many, needed to be replaced by a commitment to a less definable, but more patriotic, American culture. The desire to eliminate overt connections with European culture, a culture that seemed to instigate war rather than peace, led to strong measures designed to force change in the U.S. population. One measure was the effort to eliminate parochial schools, which were viewed as being too closely tied to European culture. When Oregon amended its compulsory education laws in 1922 with the intent of eliminating parochial schools, the state faced opposition, including a Supreme Court case, Pierce v. Society of Sisters (1925), that ended in a ruling against it. It was hoped that public education would transform the population into a more cohesive culture, and while states could not compel attendance at public rather than private schools, over time many states were able to dictate curriculum requirements and achieve the underlying goals sought by legislators during the post-war period.

Many in the United States believed that the nation had a vital responsibility to encourage and spread notions of republican democracy. A growing belief in ‘American exceptionalism’ developed in the post-war years, due in part to wartime propaganda. If the United States was to be exceptional, then it needed to guarantee that its public understood what made it exceptional. Accomplishing this task meant that its citizenry needed to understand history, and not just the history of the United States beginning with colonization or independence; a citizen needed to understand the connection between the United States and the ancient history in which the foundations of democracy resided. Compulsory education, classes in American History and Western Civilization, and an emphasis on U.S. exceptionalism became the foundation for unifying the nation during the twentieth century.

When Buying Foreign Was in the U.S. National Interest

Historian Stephanie M. Amerian recently published an excellent article about the Marshall Plan and the U.S. government’s promotion of “buying European” in the years following the end of World War II.[1] It was of vital national interest for the citizens of the United States to spend money on European goods, to travel to European destinations, and to support the members of the European community of nations. If the U.S. didn’t spend its currency in Europe and on European manufactured goods, then a devastated Europe would not be able to purchase U.S. raw materials and finished goods.

Protectionism and isolationism had not been successful economic or political policies in Thomas Jefferson’s day when, as president, he supported an embargo as a means to pressure Great Britain. Nor had such policies been successful in combating the effects of recessions, great or small, in the years between the Jefferson administration and WWII. The United States, while large and possessing a high level of self-sufficiency, was by the mid-twentieth century as dependent on the international flow of trade as any other nation. Whether importing luxury items from distant lands or exporting raw materials to European manufacturing hubs, the United States had a history of benefiting from international trade and of defending the notion of free markets.

War had brutally destroyed infrastructure and manufacturing capability, and had all but obliterated the purchasing power of the European nations. Consequently, U.S. manufactured goods and raw materials lost a huge portion of the international market. The United States, as a nation left relatively undamaged by the destruction of war, had the opportunity to lend a hand. Many politicians felt that in doing so, the United States could rebuild Europe following the U.S. model of capitalism and democracy. Economic support for Europe was seen as vital in preventing a third war from developing. Additionally, the United States was convinced that Soviet influence and expansion needed to be halted at Europe’s borders. Unfortunately, as the U.S. public became more aware of the Soviet threat, their support moved from lending a hand to backing military buildup. Simply put, investment in military muscle could protect the United States and its friends but did not require knowledge of economic theory. Buying foreign might have made sense to the economist, but exporting the United States in all its various forms made sense to the common U.S. citizen.


Endnotes

[1] Stephanie M. Amerian, “‘Buying European’: The Marshall Plan and American Department Stores,” Diplomatic History 39, no. 1 (January 2015): 45, http://dh.oxfordjournals.org/content/39/1/45 (accessed March 14, 2015).


Further Reading

Belmonte, Laura A. Selling the American Way: U.S. Propaganda and the Cold War. Philadelphia: University of Pennsylvania Press, 2010.

Boyce, Robert. The Great Interwar Crisis and the Collapse of Globalization. Reprint edition. Basingstoke: Palgrave Macmillan, 2012.

Hoganson, Kristin L. Consumers’ Imperium: The Global Production of American Domesticity, 1865-1920. Chapel Hill: The University of North Carolina Press, 2007.

Mariano, Marco. “Isolationism, Internationalism and the Monroe Doctrine.” Journal of Transatlantic Studies (Routledge) 9, no. 1 (Spring 2011): 35–45.

“Embargo of 1807.” Thomas Jefferson’s Monticello. http://www.monticello.org/site/research-and-collections/embargo-1807.


Going to War: Purpose and a Plan

Continuing last week’s post about the study of the motivations for war, I decided to revisit something I wrote a couple of years ago.

The Spanish-American War and subsequent Philippine War were short wars by U.S. standards but had far-reaching consequences. President McKinley’s “limited war strategy” was intended to gain independence for Cuba, but its limited scope also included a limited understanding of the consequences of international conflict.[1] Simply put, the United States was unprepared for war. While the navy was somewhat prepared, the army struggled under continued state and congressional opposition to a strong peacetime military force.[2] As with the American Revolution and the Civil War, untrained volunteers, “who fancied they were soldiers because they could get across a level piece of ground without stepping on their own feet,” were mustered and sent to war with little opportunity for training.[3]

Lack of preparation was one of the issues faced during the “splendid little war.” A greater issue was the lack of a clear objective. If independence was the objective, then it would have seemed logical for the United States to have had greater respect for the native rebels who had worn down the Spanish forces before the U.S. arrival. Rather than respecting and aiding the rebel effort, the United States went from liberator to conqueror and rejected the notion of revolution and self-governance. Instead, the United States implemented a paternalistic imperial rule over the former Spanish colonies. Although there would be efforts at nation building and promises of self-rule, economic and military dependency became the reality.

Whatever goals President McKinley might have had in justifying war, they seem to have gone with him to his grave.[4] While Cuba would achieve a semblance of independence once the war ended, the Philippines would find itself embroiled in further war and facing an arguably unwanted annexation. The United States would become an empire by default more than by plan. McKinley’s little war would also have unexpected, long-term consequences on U.S. military strategy.

The Spanish-American War and the Philippine War, which created a new empire, would encourage future generations to believe that a guerrilla opposition could be snuffed out with enough oppression, pacification, and force. While McKinley had not recognized the nature and consequences of international war coupled with imperial occupation, later presidents would justify future international wars based on the perceived successes of these conflicts. Only after it was too late would they realize that occupying islands cut off from allies and supplies was an easier task than occupying lands connected to supply networks. In a time when photographic war journalism was in its infancy, and the atrocities of war could still be ignored by civilians in the United States, pacification policies, total suppression of civilians and combatants, and a scorched-earth policy could subdue an enemy without public outcry. The United States would eventually learn that people may cry for war when national interests are at risk, but they have little stomach for the devastation war brings once they are faced with its brutal reality. Former U.S. secretary of state and retired general Colin Powell once said, “War should be the politics of last resort. And when we go to war, we should have a purpose that our people understand and support.”[5] More importantly, a nation should only go to war when the president understands the clear purpose of the proposed war and has thoroughly weighed the consequences, both short-term and long-term.

Endnotes

[1] Allan R. Millett and Peter Maslowski, For the Common Defense: A Military History of the United States of America, rev. and exp. ed. (New York: Free Press, 1994), 286.

[2] Ibid., 303.

[3] Ibid., 290.

[4] Brian McAllister Linn, The Philippine War, 1899-1902 (Lawrence, KS: University Press of Kansas, 2000), 3.

[5] Tim Russert, “Powell’s Doctrine, in Powell’s Words,” The Washington Post, October 7, 2001, http://www.mbc.edu/faculty/gbowen/Powell.htm (accessed September 11, 2012).

Home Front: A Culture of War

Patriotism and honor spurred the rise of War Fever during the First World War. Men signed up to fight, and women proudly waved them good luck and goodbye. Yet as the war dragged on, War Fever waned. It was replaced by a culture of war, which may not have eliminated the need for conscription but did place a stigma upon those who did not support the nation’s efforts to defeat the enemy. Censorship and propaganda aided in the development of a culture focused on patriotic valor, sacrifice, and mourning.

Governments felt a strong need to censor bad news. It took President Wilson less than a month after the declaration of war to create the Committee on Public Information (CPI), which was tasked with the censorship of telegraph and radio communications. The CPI was not only tasked with keeping negative news from reaching the public; it was also tasked with promoting a war that seemed to have little direct impact on the United States as a whole. The United States was not alone in needing to promote a war when direct threat did not seem imminent; Great Britain had faced such a conundrum as well. Politicians and military leaders may have been convinced of the dangers the enemy posed to national interest and defense, but the common public had to be convinced. The United States faced an even greater challenge in convincing its public than the British had faced. Proximity to the enemy was not an arguable justification for the United States to go to war. Additionally, evidence of sabotage and espionage did not generate the kind of call to war that invasion and occupation did for nations like France. Propaganda needed to inspire the nation to develop a culture of war based not on immediate threat but on the defense of ideology: the defense of liberty and democratic principles.

Fighting a war based upon the belief that the enemy did not seek territory but sought to destroy the virtuous fiber of the nation was accomplished by demonizing the enemy. In every war demons emerge, but it was not enough to demonize an individual or a military unit; the enemy’s society as a whole needed to be portrayed as demons seeking to exterminate all that a nation held dear. Over time, propaganda would help the public develop a hatred for an enemy who seemed to loathe the virtues of liberty, freedom, and human dignity. In the name of these virtues, the home front embraced a culture of war even as it mourned the costs of war.

Diplomacy and Destiny

It has been said that war is politics by other means, and few would disagree with the Clausewitzian sentiment, but one might also say that diplomacy is warfare by peaceful means. Often diplomacy seeks to gain without violence the same objectives that empires of old sought to gain through war. Relying upon the Machiavellian precepts of being feared rather than loved and of justifying the means by the end results, great diplomats have doggedly pursued national interests, sometimes believing destiny had already prescribed a greater future than present circumstances provided. One such diplomat was William Henry Seward (1801-1872). In 1853, seven years before becoming U.S. Secretary of State for the Lincoln administration, Senator Seward stated in a speech titled ‘The Destiny of America’, “Nevertheless it is not in man’s nature to be content with present attainment or enjoyment. You say to me, therefore, with excusable impatience, ‘Tell us not what our country is, but what she shall be. Shall her greatness increase? Is she immortal?’”[1] Seward believed the answer to these questions was affirmative, and he would spend his career seeking to increase the greatness of the nation he served.

Like other expansionists, Seward would link U.S. commercial strength with the acquisition of foreign markets and territorial holdings. When Mexico and British Canada proved infertile soil for acquisition, Seward looked elsewhere. Seward believed that the United States had a destiny to spread its notions of liberty to the new nations breaking free from European imperialism, particularly those liberating themselves from Spain. Unfortunately, he also believed, as many did, that shaking off imperial control did not necessarily mean the people of Latin America were prepared to self-govern.[2] Seward believed the southern neighbors would be better served if they became part of the United States. Seward achieved a piece of his goal by pushing for the purchase of Alaska, and while it was considered folly at the time, the discovery of gold changed how most viewed the acquisition. He had less success in his efforts to secure other territories in the Caribbean and Central America. However, he would be remembered for the tenacity with which he sought U.S. expansion, a tenacity that often diverged from diplomacy and bordered on bullying.[3] Those who were unfortunate enough to have sparred with Seward would have felt bombarded and under attack, and would have wondered at the fine line Seward drew between diplomacy and war. With a focus firmly on the destiny of U.S. greatness, Seward behaved more like a commanding general than a diplomat. Seward believed the destiny of the United States was not limited to the contiguous lands of North America, but that it reached far beyond. Eventually Seward’s tenacious diplomacy would be replaced by combat in a war that would acquire some of the territory Seward had desired. His vision of U.S. expansion, while not achieved during his time in office, did influence the direction of U.S. expansion as the nineteenth century drew to a close. Whether through diplomacy or warfare, men like Seward were determined to see the United States fulfill its destiny of greatness.

Endnotes

[1] Frederick Seward, Seward at Washington as Senator and Secretary of State: A Memoir of His Life, with Selections from His Letters, e-book (New York: Derby and Miller, 1891), 207.

[2] William Henry Seward, Life and Public Services of John Quincy Adams Sixth President of the United States with Eulogy Delivered before the Legislature of New York, e-book (Auburn, NY: Derby, Miller and Company, 1849), 122-123.

[3] George C. Herring, From Colony to Superpower: U.S. Foreign Relations Since 1776 (New York: Oxford University Press, 2008), 255-257.

Off the Battlefield and Socks

An old photo depicting men and women knitting socks flashes before my mind’s eye. Young and old, men and women, the wounded. Knitting socks was a way to support the troops of World War I. Today a trip to Walmart can easily supply a package of cotton socks. Wool socks, sturdy and durable, might take a bit more searching to find, but a visit to a good sporting goods store, especially one selling skiing supplies, will do the trick. The days when proper foot care required handmade socks are long gone, and with the passage of time the memory of the dedicated service provided by the sock makers has faded. It is estimated that sixty-five million men were mobilized to fight in WWI, and each soldier would have needed socks as he went to war, and then more socks to replace the ones worn out by long marches or damp trenches. On the home front, knitting campaigns called people to action. Idle hands at home meant soldiers on the battlefield would suffer.

The technological advancements of the early 1900s did not eliminate the need for handmade socks, and as the world entered a second war, the patriotic call again went out for more socks. However, technology had made war much more destructive. The bombing campaigns of WWII left towns in rubble and displaced an estimated sixty million Europeans. When the war ended, the hardships of war did not. Basic essentials for survival were still desperately needed. The infrastructure destroyed by military campaigns had to be rebuilt before the suffering could end. Battlefields had to be cleared and communities reestablished. Unfortunately, the humanitarian efforts of busy hands and caring hearts ran into political roadblocks. Decimated nations could not process and deliver the goods effectively. A care package from a distant relative or a long-distance friend had an easier time getting through to a family in need than did large-scale aid from relief organizations.

By the end of the twentieth century, handmade socks were a novelty rather than a necessity, and nations had learned valuable lessons about the effects of war on and off the battlefield and about the need for post-war recovery efforts to eliminate humanitarian crises once war had ceased. As the century ended, the severity of war had not necessarily diminished, but the percentage of the population directly affected by war had. War still displaced, disrupted, and decimated local populations, but it seldom reached the distant homelands of the foreign nations providing military support for weak governments. Therefore, the patriotic call to serve those who sacrificed and suffered in the name of liberty, freedom, or national interest was easily drowned out by the pleasurable distractions of life in a homeland untouched by war. By the end of the twentieth century, war, much like handmade socks, was a novelty rather than a reality – something other people might do, but not something that had a place in the modern, fast-paced, safer world many were sure the new century would bring.

Intervention: Ideology Versus National Interest

It has been twenty-four years since the First Gulf War,[1] a short war which might better fall under the categorization of international intervention in a conflicted region than that of a truly international war.[2] Military interventions were not uncommon during the twentieth century, but the First Gulf War was unique in that it found support from parties who, only months and years prior, had been locked in the seemingly endless power struggle known as the Cold War. The international community, appalled at the blatant disregard for the national sovereignty of Kuwait, rallied support for military intervention when other means of international pressure failed to reverse the invasion. As 1990 drew to a close, debate raged in Washington, D.C. and elsewhere over the justifications for and against intervention. At the very heart of the debate was the question of whether the international outrage over Iraq’s aggression was due to the economic national interests of oil-consuming nations or whether the ideology of international cooperation and peacekeeping was the justification for intervening in a conflict between two parties.

World War II demonstrated that it is unwise to overlook a hostile nation’s disregard for the national sovereignty of its neighbors. Yet as clear as the lessons of WWII were to the international community, going to war to protect another nation’s sovereignty was not an easy choice. The argument was made that the protection of oil resources was the real reason for the call to action rather than the ideological desire to defend a nation’s right to go unmolested by its neighbor. Oil, despite all other justifications for intervention, was at the center of the First Gulf War. It had been the catalyst for the invasion of Kuwait, and it was undeniably of great economic national interest to many of the nations that rallied to Kuwait’s defense. It would be foolish to argue that oil wasn’t the issue at the heart of the war, but it would be incorrect to argue that it was the only issue. With the end of the Cold War, international focus had turned to the increased promotion of cooperation among nations and to greater support of international law. Sanctions were seen as a better option than military action in most cases. Whether or not all other non-violent means had been exhausted, it was decided that military action was needed in order to enforce international law and protect international interests.

Not all hostile violations of sovereignty have received the attention the invasion of Kuwait did, and the reasons for the lack of international intervention are seldom debated with the vigor seen in 1990. The First Gulf War is one of the few examples where a shared economic national interest and the ideology of international cooperation stood together to provide the justification needed for intervention.

Endnotes

[1] The Gulf War (2 August 1990 – 28 February 1991)

[2] Operation Desert Storm (17 January 1991 – 28 February 1991)

Cuba and the United States

I have long found the US/Cuba situation fascinating, particularly in light of the fact that many nineteenth and early twentieth century U.S. politicians and businessmen wished to annex Cuba, or at least to keep Cuba a friendly U.S. playground. Cuba, so close to the United States, was often a hoped-for prize. Many power brokers in the United States felt sure Cuba would eventually choose to join its neighbor to the north. The fact that it never did, but instead rejected the United States during the Cold War, makes it all the more interesting and raises the question of why it chose such a different path from the one hoped for by men like Theodore Roosevelt, President McKinley, and many others.

In 2002, historian Louis A. Pérez, Jr. wrote an article for the Journal of Latin American Studies titled “Fear and Loathing of Fidel Castro: Sources of US Policy toward Cuba.” The following is a short paper I wrote after reading this and other articles discussing theories as to why the United States persisted with Cold War policies towards Cuba even after the end of the Cold War.

Loathsome Rejection: Cuba and the United States

Masked behind a cloud of Cold War fear, Cuba’s rejection of the United States was the loathsome reality of a failed U.S. attempt at imperial influence and a direct blow at the very heart of the Monroe Doctrine. Fidel Castro was “inalterably held responsible,” and, according to Louis A. Pérez Jr. in “Fear and Loathing of Fidel Castro: Sources of US Policy Toward Cuba,” Castro became a problem that would blind policy makers for over forty years, even after the end of the Cold War.[1]

“Castro was transformed simultaneously into an anathema and phantasm, unscrupulous and perhaps unbalanced, possessed by demons and given to evil doings, a wicked man with whom honourable men could not treat.”[2]

Pérez stated that the “initial instrumental rationale” for U.S. policy toward Cuba, particularly the policy of sanctions, may have become “lost” over time, but that it was initially created under the precepts of containment.[3] However, in the case of Cuba, the practice of applying economic pressure through embargoes was undermined by the Cuban Adjustment Act of 1966, which allowed political asylum to any Cubans who made it to U.S. shores. This act became a release valve for the pressures created by the embargoes. While poor Cubans remained poor, the middle-class Cubans, who were most affected by U.S. sanctions, could attempt to seek refuge elsewhere. “The logic of the policy required containing Cuban discontent inside Cuba,” but this logic was lost amid the emotional reaction the United States had toward Fidel Castro and his rejection of the United States. This rejection was compounded by the challenge to “the plausibility of the Monroe Doctrine” and to the United States’ “primacy in the western hemisphere.”[4] If rejection was not enough to engender such resentment, inviting the Soviet Union to become a military as well as an economic ally was more than U.S. policy makers could stand without seeking retribution.

Cold War fear and rhetoric do not sufficiently account for the continued and virulent animosity between the United States and Cuba, and Pérez was not the only scholar to take note. As the Soviet system crumbled and the Cold War came to an end, “the antagonism displayed by the U.S. government toward Cuba and Castro … intensified.”[5] The continued containment of Cuba in the post-Cold War era negated decades of U.S. assertions that the Cuban policy was the direct result of the island’s status as a Soviet satellite. While others would write about the illogical continuation of Cold War policy, Pérez argued that U.S. policy toward Cuba had less to do with Cold War fear and containment and more to do with loathing and retaliation for the rejection of the United States and the embarrassment such a rejection caused.

Certainly there was a real national threat in having Soviet missiles located so close to U.S. shores, but that threat does not account for U.S. policy before and after the missiles. Wayne S. Smith, who was stationed in Cuba as a vice-consul during the Cuban Revolution, claimed that Castro and his revolutionaries were not communist threats in 1956.

“We found no credible evidence to indicate Castro had links to the Communist party or even had much sympathy for it. Even so, he gave cause for concern, for he seemed to have gargantuan ambitions, authoritarian tendencies, and not much in the way of an ideology of his own. He was also fiercely nationalistic. Given the history of U.S. military occupations, the Platt amendment, and the outsized U.S. economic presence in Cuba, he did not hold the U.S. in high regard.”[6]

Without a doubt, the United States needed to address the threat posed by Castro, but bypassing speaking softly and proceeding directly to wielding the big stick was a move that would ensure crisis rather than avoid it, especially when the Soviet Union was more than happy to lend Cuba a hand. The Soviets’ willing assistance, especially after the embarrassment of the Bay of Pigs, was all the justification President Kennedy needed to pick the moment of crisis rather than give Nikita Khrushchev the opportunity.[7]

Pérez does not argue against the notion that there was a real threat posed by Cuba; instead, he points out that the United States was handed a “trauma” when the U.S. playground turned into a war zone, and then into a dangerous Cold War threat.[8] This trauma affected the U.S. ability to rationally create and implement a policy that would stabilize the relationship and reduce the threat. “Dispassionate policy discourse on Cuba … was impossible”[9] as long as Castro remained Cuba’s leader, because he was “a breathing, living reminder of the limits of U.S. power.”[10]

Endnotes

[1] Louis A. Pérez, Jr., “Fear and Loathing of Fidel Castro: Sources of US Policy toward Cuba,” Journal of Latin American Studies 34, no. 2 (May 1, 2002): 227, http://www.jstor.org/stable/3875788 (accessed February 20, 2013).

[2] Ibid., 250.

[3] Ibid., 228.

[4] Ibid., 233.

[5] David Bernell, “The Curious Case of Cuba in American Foreign Policy,” Journal of Interamerican Studies and World Affairs 36, no. 2 (July 1, 1994): 66, http://www.jstor.org/stable/166174 (accessed February 19, 2013).

[6] Wayne S. Smith, The Closest of Enemies: A Personal and Diplomatic Account of U.S.-Cuban Relations Since 1957 (New York: W. W. Norton and Company, 1987), 15-16.

[7] Philip Zelikow, “American Policy and Cuba, 1961-1963,” Diplomatic History 24, no. 2 (Spring 2000): 325, http://web.ebscohost.com.ezproxy1.apus.edu/ehost/detail?sid=39889c50-22ab-48a2-b2e4-cd8946fd73a9%40sessionmgr15&vid=1&hid=18&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#db=aph&AN=2954415 (accessed February 19, 2013).

[8] Pérez, 231.

[9] Ibid., 250.

[10] Ibid., 251.

Other Readings

Dominguez, Jorge I. “U.S.-Cuban relations: From the Cold War to the colder war.” Journal of Interamerican Studies and World Affairs 39, no. 3 (Fall 1997): 49–75. http://search.proquest.com.ezproxy2.apus.edu/docview/200219310/13BF83A38607C999D8F/7?accountid=8289 (accessed January 31, 2013).

Herring, George C. From Colony to Superpower: U.S. Foreign Relations Since 1776. New York: Oxford University Press, 2008.

Paterson, Thomas G. “U.S. intervention in Cuba, 1898: Interpreting the Spanish-American-Cuban-Filipino war.” Magazine of History 12, no. 3 (Spring 1998): 5. http://search.proquest.com.ezproxy2.apus.edu/docview/213739998/13BF824CD53256D7D45/11?accountid=8289 (accessed January 31, 2013).

Williams, William Appleman. The Tragedy of American Diplomacy. 1972 New Edition. New York: W. W. Norton and Company, 1988.