In 1918, Fritz Haber was awarded the Nobel Prize in Chemistry. World War I delayed the presentation of the award because Haber was a German scientist, one who had earned the name ‘the father of chemical warfare’. Haber was a patriotic German committed to the German cause; yet less than fifteen years after being celebrated as a great scientist, he fled his homeland fearing for his life. Fritz Haber was a Jew. He was also an intellectual, too closely associated with a war that had been lost rather than won. Like many other German citizens, Haber discovered that under the right set of circumstances hate could replace friendship with great rapidity. Those circumstances included an economic recession, a turbulent political climate, an abundance of persuasive rhetoric, and a highly effective propaganda campaign. In less than two decades, a population that had once celebrated Haber’s achievements turned its back on the evidence that its government had implemented a policy of incarceration and extermination. Race, religious affiliation, sexual orientation, and intellectual interests were more than enough justification for the public to look the other way or, worse, to join the Nazi agenda. Change came quickly, while the public clung to the notion that its actions were justified.
U.S. Compulsory Education: Teaching Exceptionalism
During the mid-nineteenth century, states began passing compulsory education laws, and although all states had these laws in place by the time the United States entered World War I, there was still considerable disparity in the basic education received by the soldiers. Mobilization efforts during WWI highlighted the need for greater emphasis on education in the United States, but they also highlighted the need to emphasize a common nationality among its citizenry. The war had attached a stigma to citizens and immigrants who were seen as too closely related to, or associated with, the enemy. It was felt that the ‘old country’ culture, still held by many, needed to be replaced by a commitment to a less definable but more patriotic American culture. The desire to eliminate overt connections with European culture, a culture that seemed to instigate war rather than peace, led to strong measures designed to force change in the U.S. population. One such measure was the effort to eliminate parochial schools, which were viewed as being too closely tied to European culture. When Oregon amended its compulsory education laws in 1922 with the intent of eliminating parochial schools, the state faced opposition, including a Supreme Court case (Pierce v. Society of Sisters, 1925) that ruled against it. It was hoped that public education would transform the population into a more cohesive culture, and while states could not force public school attendance over private school attendance, over time many states were able to dictate curriculum requirements and achieve the underlying goals sought by legislators during the post-war period.
Many in the United States believed that the nation had a vital responsibility to encourage and spread notions of republican democracy. A growing belief in ‘American exceptionalism’ developed in the post-war years, due in part to wartime propaganda. If the United States was to be exceptional, then it needed to guarantee that its public understood what made it exceptional. Accomplishing this task meant that its citizenry needed to understand history: not just the history of the United States beginning with colonization or independence, but the connection between the United States and the ancient world where the foundations of democracy resided. Compulsory education, classes in American History and Western Civilization, and an emphasis on U.S. exceptionalism became the foundation for unifying a nation during the twentieth century.
When Buying Foreign Was in the U.S. National Interest
Historian Stephanie M. Amerian recently published an excellent article about the Marshall Plan and the U.S. government’s promotion of “buying European” in the years following the end of World War II.[1] It was of vital national interest for the citizens of the United States to spend money on European goods, to travel to European destinations, and to support the members of the European community of nations. If the U.S. didn’t spend its currency in Europe and on European manufactured goods, then a devastated Europe would not be able to purchase U.S. raw materials and finished goods.
Protectionism and isolationism had not been successful economic or political policies in Thomas Jefferson’s day, when, as president, he supported an embargo as a means of pressuring Great Britain. Nor had such policies been successful in combating the effects of recession, great or small, in the years between the Jefferson administration and WWII. The United States, while large and possessing a high level of self-sufficiency, was by the mid-twentieth century as dependent on the international flow of trade as any other nation. Whether it was importing luxury items from distant lands or exporting raw materials to European manufacturing hubs, the United States had a history of benefiting from international trade and of defending the notion of free markets.
War had brutally destroyed infrastructure and manufacturing capability and had all but obliterated the purchasing power of the European nations. Consequently, U.S. manufactured goods and raw materials lost a huge portion of the international market. The United States, a nation relatively undamaged by the destruction of war, had the opportunity to lend a hand. Many politicians felt that in doing so, the United States could rebuild Europe following the U.S. model of capitalism and democracy. Economic support for Europe was seen as vital to preventing a third war from developing. Additionally, the United States was convinced that Soviet influence and expansion needed to be halted at Europe’s borders. Unfortunately, as the U.S. public became more aware of the Soviet threat, its support shifted from lending a hand to funding a military buildup. Simply put, investment in military muscle could protect the United States and its friends without requiring any knowledge of economic theory. Buying foreign might have made sense to the economist, but exporting the United States in all its various forms made sense to the common U.S. citizen.
Endnotes
[1] Stephanie M. Amerian, “‘Buying European’: The Marshall Plan and American Department Stores,” Diplomatic History 39, no. 1 (January 2015): 45, http://dh.oxfordjournals.org/content/39/1/45 (accessed March 14, 2015).
Further Reading
Belmonte, Laura A. Selling the American Way: U.S. Propaganda and the Cold War. Philadelphia: University of Pennsylvania Press, 2010.
Boyce, Robert. The Great Interwar Crisis and the Collapse of Globalization. Reprint edition. Basingstoke: Palgrave Macmillan, 2012.
Hoganson, Kristin L. Consumers’ Imperium: The Global Production of American Domesticity, 1865-1920. Chapel Hill: The University of North Carolina Press, 2007.
Mariano, Marco. “Isolationism, Internationalism and the Monroe Doctrine.” Journal of Transatlantic Studies (Routledge) 9, no. 1 (Spring 2011): 35–45.
“Embargo of 1807.” Thomas Jefferson’s Monticello. http://www.monticello.org/site/research-and-collections/embargo-1807.
Going to War: Purpose and a Plan
Continuing last week’s post about the study of the motivations for war, I decided to revisit something I wrote a couple of years ago.
The Spanish-American War and the subsequent Philippine War were short wars by U.S. standards, but they had far-reaching consequences. President McKinley’s “limited war strategy” was intended to gain independence for Cuba, but its limited scope also reflected a limited understanding of the consequences of international conflict.[1] Simply put, the United States was unprepared for war. While the navy was somewhat prepared, the army struggled under continued state and congressional opposition to a strong peacetime military force.[2] As with the American Revolution and the Civil War, untrained volunteers, “who fancied they were soldiers because they could get across a level piece of ground without stepping on their own feet,” were mustered and sent to war with little opportunity for training.[3]
Lack of preparation was one of the issues faced during the “splendid little war.” A greater issue was the lack of a clear objective for the war. If independence was the objective, then it would have seemed logical for the United States to have had greater respect for the native rebels who had worn down the Spanish forces before the U.S. arrival. Rather than respecting and aiding the rebel effort, the United States went from liberator to conqueror and rejected the notion of revolution and self-governance. Instead, the United States implemented a paternalistic imperial rule over the former Spanish colonies. Although there would be efforts at nation building and promises of self-rule, economic and military dependency became the reality.
Whatever goals President McKinley might have had in justifying war, they seem to have gone with him to his grave.[4] While Cuba would achieve a semblance of independence once the war ended, the Philippines would find itself embroiled in further war and facing an arguably unwanted annexation. The United States would become an empire by default more than by plan. McKinley’s little war would also have unexpected, long-term consequences on U.S. military strategy.
The Spanish-American War and the Philippine War, which created a new empire, would encourage future generations to believe that a guerrilla opposition could be snuffed out with enough oppression, pacification, and force. While McKinley had not recognized the nature and consequences of international war coupled with imperial occupation, later presidents would justify future international wars based on the perceived successes of these conflicts. Only after it was too late would they realize that occupying islands cut off from allies and supplies was an easier task than occupying lands connected to supply networks. In a time when photographic war journalism was in its infancy, and the atrocities of war could still be ignored by civilians in the United States, pacification policies, total suppression of civilians and combatants, and a scorched-earth policy could subdue an enemy without public outcry. The United States would eventually learn that people may cry for war when national interests are at risk, but they have little stomach for the devastation it brings when faced with its brutal reality. Former U.S. secretary of state and retired general Colin Powell once said, “War should be the politics of last resort. And when we go to war, we should have a purpose that our people understand and support.”[5] More importantly, a nation should only go to war when the president understands the clear purpose of the proposed war and has thoroughly weighed the consequences, both short-term and long-term.
Endnotes
[1] Allan R. Millett and Peter Maslowski, For the Common Defense: A Military History of the United States of America, rev. and exp. ed. (New York: Free Press, 1994), 286.
[2] Ibid., 303.
[3] Ibid., 290.
[4] Brian McAllister Linn, The Philippine War, 1899-1902 (Lawrence, KS: University Press of Kansas, 2000), 3.
[5] Tim Russert, “Powell’s Doctrine, in Powell’s Words,” The Washington Post, October 7, 2001, http://www.mbc.edu/faculty/gbowen/Powell.htm (accessed September 11, 2012).