Humanity on the Battlefield

There is a popular story that goes around at Christmas time about soldiers all along the Western Front calling a truce and singing Silent Night on Christmas Eve. What is often left out of the story is the anger this show of humanity caused in the higher leadership. During war, a reminder that the enemy is not the monster propaganda depicts can interfere with morale and with a soldier’s determination to win at all costs. Yet on that Christmas Eve, men on opposing sides of a futile war remembered that only politics separated them. Christmas marked the fifth month of war and the third month in the trenches. World War I was still in its early days, and there was still hope for victory and for the short war the generals and politicians on both sides had promised the soldiers. The peace hoped for on Christmas Eve 1914 would not be found until Christmas time 1918. The brutality of the war and the anger of generals would squelch attempts to repeat what had sprung up so naturally along the Western Front in 1914. However, the legend of the first Christmas of WWI would remind generations that humanity can survive even in war.

Paranoia and Insecurity: A Lesson from WWII

“On a morning in December 1941, a small nation which the United States had sought to contain and squeeze into submission through economic and diplomatic pressure, attacked with crippling force a naval base belonging to one of the largest nations of the world. Japan’s aerial attack on Pearl Harbor shook the United States and its sense of security.”[1] In the movie 1941, director Steven Spielberg created a comical portrayal of a population driven to protect their coastline from Japanese attack. In Spielberg’s outlandish film, the insecurity caused by the attack on Pearl Harbor fed paranoia and panic and resulted in chaos. The movie was a spoof of the real paranoia that existed during World War II, a paranoia which allowed a nation to justify its own attack on liberty.

On February 23, 1942, a Japanese submarine entered the coastal waters near Santa Barbara, California and bombarded an oil field at Ellwood. Just days before the attack, President Roosevelt had issued Executive Order 9066, which authorized the creation of policies that would lead to the internment of U.S. citizens. Coupled with propaganda films portraying the enemy as barbaric and animalistic, the events of late 1941 and early 1942 created an insecurity within the population that seemed to justify the civil rights violations that would follow.

Terror is an effective tool in war and can have a much greater effect on a population than a physical attack. An enemy will try to strike fear into the hearts and minds of its opponent in the hope that terror will weaken it. Modern technology made it possible for fear to spread rapidly through media, and media played a vital role in spreading propaganda messages during World War II. The U.S. government worked hard to control propaganda, both the enemy’s and its own, but public fear was used as a tool to garner support as well. Justifiable actions of a nation at war, actions which deliberately heightened public fear and restricted civil liberty, seem less justifiable when the war ends but the insecurity remains. After World War II ended, the fear generated by the physical attacks on the nation diminished, but the fear created by the pervasive use of propaganda during the war remained embedded in the public psyche. History seems to indicate that nations can quickly recover from the physical challenges of war, but the psychological challenges, often heightened by the use of politically motivated propaganda, take much longer to repair. Long after the physical attack becomes just a memory, paranoia and insecurity can linger, continuing to justify the restriction of liberty.

End Notes:

[1] Jessie A. Hagen, “U.S. Insecurity in the Twentieth Century: How the Pursuit of National Defense Ingrained a State of National Insecurity,” American Military University, 2014.

Additional Reading:

Conley, Cornelius W. “The Great Japanese Balloon Offensive.” Air University Review XIX, no. 2 (February 1968): 68–83. http://www.airpower.maxwell.af.mil/airchronicles/aureview/1968/jan-feb/conley.html.

Dower, John W. War Without Mercy: Race and Power in the Pacific War. New York: Pantheon, 1986.

Roosevelt, Franklin D. “Executive Order 9066 – Authorizing the Secretary of War to Prescribe Military Areas,” February 19, 1942. Papers of Franklin Roosevelt. The American Presidency Project. http://www.presidency.ucsb.edu/ws/index.php?pid=61698.

———. “Fireside Chat, December 9, 1941,” December 9, 1941. http://www.presidency.ucsb.edu/ws/index.php?pid=16056.

———. “Fireside Chat, February 23, 1942,” February 23, 1942. http://www.presidency.ucsb.edu/ws/index.php?pid=16224.

“Civil Rights.” PBS: The War. Last modified 2007. http://www.pbs.org/thewar/at_home_civil_rights_japanese_american.htm.

“George Takei Describes His Experience in a Japanese Internment Camp.” io9. http://io9.com/george-takei-describes-his-experience-in-a-japanese-int-1533358984.

National Security: The Value of Nutrition and Education

In the years leading up to World War I, many progressive thinkers began to campaign for social reform. The industrial revolution changed society in many ways, not all of which were good for the nation or for national security. Unskilled and skilled labor alike were susceptible to the ills of urban life. Just as the war in Europe was igniting, one group of progressive reformers was introducing home economics textbooks and coursework into schools. Proper hygiene and good nutrition began to be taught alongside other subjects. Malnutrition and disease were viewed as ills which not only weakened society but undermined national well-being. The reformers who pushed for better living conditions and education for urban families gained a powerful ally when the United States entered WWI. That ally was the U.S. Army. When faced with a modern war, modern in both weaponry and technology, the U.S. Army quickly discovered that it was no longer viable to go to war with illiterate soldiers. Modern war demanded healthy soldiers who could communicate efficiently with one another. Basic health and literacy became necessities for the modern army. The ground gained in understanding this truth was not easily won. The soldiers who fought in the war learned firsthand the value of both a healthy body and the ability to communicate with their fellow soldiers. A common language, coupled with the ability to read and write in it, would be something the returning soldiers would seek for their own children. These veterans would push for change. By the end of World War II, the realities of modern war made a citizenry possessing basic health and education a necessity. Education and proper nutrition became a matter of national security.

Additional Reading:

  • Keene, Jennifer D. Doughboys, the Great War and the Remaking of America. Baltimore: The Johns Hopkins University Press, 2001.
  • National Security Act of 1947, Title X.
  • There were various publications designed to introduce Home Economics in the schools. Some have been scanned and can be found in different e-book collections. Original copies can be found through used bookstores. My favorites were authored by Helen Kinne and Anna M. Cooley.

No Man’s Land

A term older than World War I but popularized during that war, no man’s land refers to a stretch of land under dispute by warring parties, but it can also refer to lawless areas with little or no governing control. A buffer zone, on the other hand, is an area which provides a sense of protection from the enemy. When physical fortifications offer little protection, buffer zones can provide a perception of security. Nations great and small seek the perception of security when security is elusive. Treaties and alliances are traditional means of creating a sense of security, as is the creation of buffer zones. During the Cold War, the competing nations sought to expand their spheres of influence, thereby creating buffer zones between themselves and their enemies as their spheres grew. When the Cold War ended and the buffer zones were no longer needed, many of the buffer nations found themselves with fewer friends and with fewer resources to prevent lawlessness. These nations found it difficult to avoid the development of no man’s land within their borders.

The United States reasoned, even in its earliest days, that oceans made excellent buffer zones against the conflicts of Europe. Unsettled territories were adequate as buffers, but only to a point. While unsettled territories didn’t pose a direct European threat, they were still loosely under the influence of powerful countries. Additionally, they often attracted outlaws fleeing justice and smugglers seeking a base of operation near their markets. In 1818, Andrew Jackson decided to pursue a group of raiders into Florida. The problem was that Florida was owned by Spain, and Spain had little ability to prevent lawlessness in the territory. When Jackson’s army crossed into Florida, he invaded a foreign nation. Without the consent of Spain, such an action created an international incident. Fortunately, Secretary of State John Quincy Adams was able to capitalize on Jackson’s actions and convinced Spain that a treaty was better than a war. His reasoning for defending Jackson’s violation of Spanish sovereignty was that “it is better to err on the side of vigor.”[1] It was certainly not the first time a nation chose a declaration of strength as its response to an international crisis of its own making, but it was possibly the first time such a response became national policy. As Secretary of State, Adams greatly influenced the foreign policy decisions of the president and authored much of what President Monroe presented to Congress. In March 1818, President Monroe declared to Congress that when a nation no longer governed in such a way as to prevent its lawlessness from spilling onto its neighbors, then the neighbors had the right to protect themselves and to seek justice even if it meant violating the sovereignty of another nation.[2] In other words, when an area became no man’s land, it was to the benefit of all nations for the lawlessness to be eliminated by whoever had the strength and will to do so.

Eliminating no man’s land in North America was a task that occupied the United States for more than a century. Eventually, the United States would reach from ocean to ocean and would gain the military might of a great nation. However, even as the twentieth century dawned, the United States struggled to bring law to all of its territory. During the century of expansion, some in the United States saw potential in the acquisition of territory to the south, particularly in Central America. Others recognized the difficulty of governing such a vast nation. Faced with lawlessness spilling over from revolt in Mexico during World War I, President Wilson authorized the U.S. Army’s invasion of Mexico. However, Wilson recognized the value of having a buffer zone south of the border and eventually withdrew the army. In order to ensure that the southern nations created a friendly buffer zone, the United States supported governments that kept the peace, even though keeping the peace came at the expense of basic human rights. Like many leaders before and since, President Wilson put aside ideology and accepted peace-by-force as being better than lawlessness.

Reflecting on history, some leaders have sought security by building huge empires, some by establishing buffer zones, and others by the targeted elimination of no man’s land. Regardless of the method men and nations have chosen, it is clear that international law, notions of liberty and self-determination, and hope for world peace are always secondary to the goal of eliminating the threat posed by no man’s land.

Endnotes:

[1] Samuel Flagg Bemis, John Quincy Adams and the Foundations of American Foreign Policy (New York: Alfred A. Knopf, 1956), 315-316.

[2] James Monroe, “Spain and the Seminole Indians,” American Memory, Library of Congress, (March 25, 1818),  http://memory.loc.gov/cgi-bin/ampage?collId=llsp&fileName=004/llsp004.db&Page=183.

At the End: The Cold War

Twenty-five years ago the Berlin Wall was opened. Unplanned and unauthorized by the powers who controlled the border between east and west, the opening of the Wall signified the end of the Cold War and the beginning of a new era. While the momentous act of opening a gate and letting people pass from east to west gained much attention at the time, other factors had been at play that would pave the way to peace and solidify the end of the Cold War in ways which went relatively unnoticed by the general public. Much has been written on the subject, but not by authors with huge public followings. In honor of the twenty-fifth anniversary, we should look back. The following is a short essay* on one aspect of the end of the Cold War, just enough to pique your interest.

Historian John Lewis Gaddis has written that “the Cold War itself was a kind of theater in which distinctions between illusions and reality were not always obvious.”[1] It was fitting, then, that thespians took the stage for the final act. While it is common knowledge that President Ronald Reagan graced the silver screen in his younger days, it is less well known that other important actors of the final act had theatrical experience prior to their Cold War roles. Mikhail Gorbachev had been an “aspiring actor”[2] in his youth, and the influential Pope John Paul II “had been an actor before he became a priest.”[3] The success of such actors on the Cold War stage was not due simply to their arrival upon the stage, but due in great part to the stage setting which they inherited.

As with the origins of the Cold War, the end of the Cold War is not precise. Unlike hot wars, which tend to end with the signing of peace treaties and have a clear chain of events preceding peace settlements, the end of the Cold War is ambiguous. As historian George C. Herring pointed out, there is a myth that Reagan’s strong posturing and rhetoric were the direct cause of Soviet defeat.[4] Yet to subscribe to such a myth negates the important role of the other actors and of those who set the stage on which the thespians performed. Gaddis wrote, “it took visionaries – saboteurs of the status quo – to widen the range of historical possibility.”[5] More importantly, it took actors well versed in the art of improvisation, actors who could recognize the changing dynamics of the Cold War and grasp the opportunities of change. While there are numerous scenes in the last act of the Cold War, each of the three actors played a key role.

First, after the Able Archer exercises, President Reagan “drew the obvious – but for Cold War adversaries often elusive – conclusion that the Soviets feared the United States as much as Americans feared them.”[6] This shift led Reagan to adjust his strategy. While on one hand he ratcheted up the rhetoric, on the other he became more amenable to negotiations because he knew the United States had the upper hand.

Second, Mikhail Gorbachev recognized that public language was not really the same as diplomatic language, and that politicians like Reagan were playing to an audience. While Reagan certainly had an image to keep and a role to play, Gorbachev had an equally crucial, if not more crucial, part to play. He had to convince his people that glasnost and perestroika were positive changes, and that negotiations with the West were not signs of weakness.

The third actor, Pope John Paul II, helped “expose disparities between what people believed and the systems under which the Cold War had obliged them to live.”[7]

The Pope’s visit to Poland revealed that the USSR’s satellites enjoyed no popular legitimacy: they were puppet regimes hated by their subject populations. But Pope John Paul went further. He demystified the power of those regimes. With his words, his presence, and his injunction not to feel afraid, the Pope was for a while the real government of Poland.[8]

It would be wrong to assert that the Cold War, even in its final decade, lacked any real cause for fear, but Pope John Paul II defused the overwhelming and consuming fear that had dominated the public since the days of Stalin.

The three actors took the world stage and improvised rather than continuing the Cold War script in which the bipolar status quo was viewed as “more stable than multipolar systems.”[9] The final act of the Cold War commenced once these three actors realized that President Roosevelt had been correct that fear itself was the thing to fear, and that the political divide could be cracked and then normalized once the people stopped living under the oppression of that fear. While Cold War theatrics occasionally resurfaced, particularly when Reagan gave his famous “tear down this wall” speech in 1987, they did not deter the movement toward normalization between the United States and the Soviet Union.[10] The real success of the final act of the Cold War play is that while tough talk and grand speeches still satisfied the public expectation of strength, real changes were occurring within the Soviet Union. The stage had been set by the policies of containment, the “collapse of détente,” the inherent weaknesses of the Soviet system, and mutual overspending on deadly war machines, but the final act was the result of leaders desiring a change in the status quo.

* Due to unexpected issues this week, I am recycling an old essay rather than creating something new to commemorate the anniversary of The Fall of the Berlin Wall.

Endnotes:

[1] John Lewis Gaddis, The Cold War: A New History (New York: The Penguin Press, 2005), 195.

[2] George C. Herring, From Colony to Superpower: U.S. Foreign Relations Since 1776 (New York: Oxford University Press, 2008), 894.

[3] Gaddis, 195.

[4] Herring, 894.

[5] Gaddis, 196.

[6] Herring, 896.

[7] Gaddis, 196.

[8] John O’Sullivan, “Warm Cold Warrior,” National Review 57, no. 7 (April 25, 2005): 38.

[9] Gaddis, 196.

[10] Herring, 898.

Man or Machine: War in the 20th Century

World War I changed many facets of warfare, particularly where technology was concerned. The automobile, chemical weapons, and tanks stand out, but so do airplanes, submarines, and machine guns. Some of these weapons were developed during the war and some were simply advanced beyond their pre-war status. The prevalence of the machine in World War I marked a dramatic shift in how the wars of the twentieth century would differ from the wars of previous centuries. These new machines created mass casualties beyond anything that Europe had ever experienced.

By the end of World War II and the beginning of the Cold War, nations would wonder if machines could be used to replace the soldier, or at least reduce the cost in human life. President Truman considered nuclear technology a means to reduce Allied casualties. After two devastating world wars, the notion of technology carrying the brunt of the work was appealing. The second half of the twentieth century provided opportunities for the theory to be tested. Air power became a key component in strategic planning. Bombardments from the air, whether from aircraft or from missiles located many miles from the conflict zone, devastated communities. All indications seemed to point to a day when machines would replace boots on the ground. However, machines did not rout the enemy regardless of the devastation they created. As the twenty-first century dawned, warfare seemed to depart from the oceans and grand battlefields where the machine dominated and instead entered the villages and city streets where man could maneuver more adeptly. Despite all the technological development of the machines of war, man, with his natural adaptability, time and time again remained supreme.

War Fever

The dawn of the twentieth century had all the conditions necessary for an outbreak of war fever. The great nations and empires of the previous centuries were struggling to hold onto their power and status. New nations were expanding and seeking empire status in the wake of shifting colonial control and changes in markets. Revolution and nationalistic zeal challenged and destabilized the status quo. War was an opportunity to demonstrate national strength and military superiority. It was viewed as both a means to hold onto power and a means to gain power. War fever spread simply because those with power did not wish to see it dwindle and those without it wanted to gain what they believed was being denied to them. Modern technology made war destructive beyond measure, but technology also helped spread the propaganda necessary to enflame populations and maintain war fever.

During the mid-1800s, under the leadership of Otto von Bismarck, Germany became a unified nation and strove to become a great industrialized power. Yet while Bismarck sought unification and political strength, Kaiser Wilhelm II longed for military glory and a strong German state which could withstand a simultaneous attack by its neighbors. Leading Germany on a path of militarization, the Kaiser ignited the sparks that would lead to war fever in 1914. In a recent article, Professor Holger Afflerbach wrote that war was both dreadful and “glorious,” with soldiers being granted “high social prestige,” especially in nations where a militarization movement had taken root.[1] Germany was not the only nation to experience war fever or to glorify the honor of soldiering. Prior to World War I, war had been limited in its scope, at least for the most part. Armies battled armies, and the civilian population as a whole was relatively unaffected by combat. Total war was a concept few had experienced directly. Technology and ease of transportation had begun to change warfare during the 1800s, but it would take World War I to bring these changes to public light. So in August 1914, when the call to war was made, men signed up for what they hoped would be a quick, glorious war. The armies of Europe swelled as patriotic fervor fanned the flames of war fever.

Strangely enough, despite entrenched combat and the undeniable horrors of modern warfare, war fever spread to the United States in 1917. In an age when media was sympathetic to the nation and national causes, little of the true nature of war made it into the far-off homes of U.S. citizens. An ocean away, the horror of war was overshadowed by patriotic notions of rallying around the flag and racing to the aid of allies. In a great crusade to defend democratic liberty, the United States promoted war fever in order to fill its military ranks. In doing so, it demonstrated the value of industrial might in world affairs and propelled itself to great power status in a world where the traditional balance of power was shifting. Germany had disrupted the balance of power established under the Concert of Europe when it unified and became an industrial force. Fearing its closest neighbors, Germany industrialized and militarized, making itself a rival and a threat to European stability and the status quo. The United States, on the other hand, feeling less threatened by its neighbors, had dedicated its energies to industrialization. Germany recognized the danger the United States posed as a major industrial nation, but calculated that the weak U.S. military structure would hinder a U.S. response to European war. That calculation was wrong. The United States surprised the world with a rapid response. Due in part to a propaganda campaign which not only ignited war fever but used modern technology to spread it quickly and widely throughout the U.S. population, the United States went from anti-war to pro-war almost overnight, although the actual preparations involved in having an army ready to fight took considerably longer.

War fever was a contagion that benefited from the notion that a limited war produced little disruption to the home front and would grant the nation, and its warriors, prestigious accolades. While World War I would demonstrate the brutality of modern war and introduce to Europe the horror of total war in a modern age, the lessons would not be universally comprehended. The deprivations of war would be better understood by the end of World War II, when aerial bombardment turned the home front into the front line. The realities of modern war should have eradicated war fever entirely; however, the threat of war fever returned as total war became part of history and the notion of limited war reemerged as a prominent strategy during the Cold War. Much like a viral contagion, war fever could rage for a period and then die down until once again conditions were right for its return.

[1] Holger Afflerbach, “The Soldiers Across Europe Who Were Excited About World War I,” The Conversation, August 4, 2014, (accessed October 24, 2014), http://theconversation.com/the-soldiers-across-europe-who-were-excited-about-world-war-i-29807.

Unexpected Consequences: Embargoes in the Early 1800s

Thomas Jefferson’s Embargo Act of 1807 did not work out the way he had planned. The restricted flow of British goods entering the United States spurred the development of U.S. manufacturing and changed society. Even the Embargo Act’s replacement, the Nonintercourse Act, which allowed for trade with nations other than Great Britain or France, did not halt the changes occurring within the United States. Interestingly, two changes in domestic life resulted from the embargo and the war: increased investment in cotton textile manufacturing, and the development of iron production, particularly cast iron. Eventually these developments in U.S. manufacturing would provide cheaper textiles for the home and promote changes in the production of food due to the proliferation of the cooking stove.

In the early 1790s Samuel Slater helped develop the first modern cotton mill in the United States, but it would be two decades later, during the time of embargoes and war, that Francis Cabot Lowell’s textile mill system, which incorporated into one location the production of cotton thread and the finished woven cotton fabric, was built. These changes reduced the cost of production and increased productivity, thereby making cotton fabric more available to the average household. Prior to these changes in cotton textile manufacturing, cotton was considered a luxury fabric. Whereas flax could be grown easily and turned into linen by skilled spinners and weavers in the United States, cotton fabric was typically imported prior to the development of U.S. manufacturing in the early 1800s.

In addition to the availability of less expensive textiles, changes in cooking methods were occurring due to the new technology of the cooking stove. While the first modern cooking stoves began appearing in the mid-1700s, it wasn’t until the end of the century that the major flaws had been worked out. Yet the cooking stove remained outside the reach of the average home due to the high cost of cast iron in the United States. The embargoes and the War of 1812 highlighted the need for increased domestic production of iron, and by the 1820s iron production had spread through Pennsylvania, with Pittsburgh becoming known as the “smoky city.”[1] The greater availability and affordability of cast iron stoves changed the way food was prepared. Not only were the stoves safer than open fires, they allowed the cook a greater range in what could be prepared. No longer limited to a stewpot or spit, a woman could prepare a larger variety of food for her family without the need of additional labor in the home.

Thomas Jefferson had opposed men like Alexander Hamilton who had promoted the development of U.S. manufacturing. Neither Jefferson nor Hamilton was blind to the ills of industrialization, but while Jefferson preferred to believe that the agrarian lifestyle was superior to manufacturing and would better promote liberty, men like Hamilton understood that economic power would be required to protect that very same liberty. The production of raw materials alone would not be enough to propel the United States to greatness. Without economic greatness, liberty would always be threatened by the dangers of imperialism and war. So while Jefferson’s embargo was meant to pressure the European nations into respecting U.S. sovereignty, it acted as an affirmation that U.S. manufacturing was vital to U.S. survival. The embargoes also helped change domestic life in the United States. Textiles for bedding and clothing became more available and less expensive, and the entire method of cooking was transformed. The U.S. response to international conflict in the early 1800s resulted in the unexpected consequence of increased investment in manufacturing, which in turn changed society by transforming life in the home.

[1] Anne Madarasz, “Tracing the Smoky City,” Western Pennsylvania History, 2002, (accessed October 18, 2014),  https://journals.psu.edu/wph/article/viewFile/5111/4894.

War Hawks

In the spring of 1811, a group of young men arrived in Washington, D.C. to fill congressional posts. Led by men like Henry Clay and John C. Calhoun, these new members of Congress called for stronger measures in dealing with Europe and with the American frontier. War was viewed as the answer to problems unresolved by diplomacy or embargo. Europe, it seemed, placed little value on U.S. sovereignty. The wars of Europe threatened U.S. economic stability and increased the British tendency to confiscate U.S. ships and impress U.S. sailors. On the western frontier, the native population was less friendly to the United States than to British Canada, increasing the worry in the United States that Great Britain might use the natives to further challenge U.S. sovereignty. Additionally, Spain did little to curb the native raiding parties that caused havoc along the U.S. southern border. The War Hawks, as the new congressmen were called, believed that war was not only inevitable but also the only practical solution.

Henry Clay stated, “…where is the motive for longer delay? … Our wrongs have been great; our cause is just; and if we are decided and firm, success is inevitable.” He continued with assurances that the United States was not only prepared but that Britain would not bother with another war in America. “The idea is too absurd to merit a moment’s consideration.”[1] By the end of 1814, the British had burned Washington, and U.S. leaders in the northeast were discussing secession as a solution to the economic crisis plaguing their region. However, it was not battlefield victory that ended the war Clay had so eagerly sought, but diplomacy; the peace was concluded before Andrew Jackson’s famous victory at New Orleans. The United States had fought once more to establish its sovereignty, but peace was not won by U.S. military prowess; rather, once again the British grew weary of war with the Americans, and a diplomatic solution was sought by both sides. Clay had been right to some extent when he said the British would not want to fight on American soil, not after such a long struggle with Napoleon.

Europe was war weary and the United States, while not strong enough to defeat its armies and navies, was strong enough to make war an unappealing prospect. Additionally, all of Europe was ready to see an end to revolution and the international wars which had caused immeasurable strife for decades. The powers of Europe, under the new Concert of Europe, would go to great lengths to prevent war and would work to create a balance of power that would deter nations from seeking war when dealing with their neighbors. This dedication on their part would afford the United States the opportunity to grow as a nation both in size and strength without being entangled in or harmed by European war.

U.S. politicians dedicated to expansionist policies were well aware of the European frustrations with war. In the years following the end of the War of 1812, the United States would capitalize on Europe’s preoccupation with keeping peace at home. It would expand westward and southward. Great Britain, with its powerful navy still intact after decades of war, was the only nation which truly challenged the United States, and the British seemed content to focus on trade rather than colonization in the Americas.

While the U.S. managed to free itself from the machinations of Europe during the War of 1812, there may have been some unexpected consequences. The war may have increased the belief that a weaker nation could defeat a much greater military force simply by wearing down the enemy’s desire to fight. The United States would become an example that many would emulate in the future, not always to the benefit of the United States or its allies.

Additionally, the war promoted the notion that a just cause made for a successful war. Yet success was anything but inevitable, despite what Henry Clay had claimed. Military theorist and Prussian general Carl von Clausewitz wrote that war was a “game in which Time and Chance shuffled the cards; but in its signification it was only diplomacy somewhat intensified, a more vigorous way of negotiating, in which battles and sieges were substituted for diplomatic notes.”[2] In the case of the War of 1812, “time and chance” favored the United States. War may, in the end, be unavoidable, but success in war is never guaranteed regardless of the rhetoric and zeal of people best labeled as War Hawks.

[1] Henry Clay, “Letter in Support of the War of 1812,” 1812, (accessed October 16, 2014), http://teachingamericanhistory.org/library/document/letter-in-support-of-the-war-of-1812/.

[2] Carl von Clausewitz, On War, trans. Dominique Poirier and J. J. Graham, Kindle ed., 2010.

Intertwined: Codependency and War

During President Thomas Jefferson’s second term in office, the sovereignty of the United States was challenged by both Great Britain and France. Europe was at war, and Napoleon was determined to maintain control over European ports and trade. In a move intended to inflict economic injury on Great Britain, Napoleon established the Continental System, a policy that allowed France to confiscate all British goods entering Europe. As a consequence, trade goods which had made port in British harbors were also considered contraband regardless of their origins or the flag under which they flew. In retaliation, Great Britain, with its much larger navy, ratcheted up its control of the seas and began seizing trade goods, ships, and sailors both on the open ocean and just off the coastal waters of the United States. In June 1807, the British warship HMS Leopard attacked the USS Chesapeake in U.S. coastal waters. Thomas Jefferson was faced with a clear act of war, and with an attack on the very sovereignty the United States had fought so hard to establish three decades earlier. Unfortunately, he led a nation as unprepared for war as it was dependent on trade with Europe.

When the modern citizen thinks of war preparation, thoughts usually turn to war machines and armed forces. By the twentieth century, the industrial might of the United States made it easy for the U.S. citizen to forget the difficulties that preparing for war entailed in the early industrial days when Thomas Jefferson was president. There was not simply the need to build up the navy and the army, but also the difficulty of supplying all the materials a navy and an army would require. Many in the United States believed that Europe, especially industrial Great Britain, relied so heavily on U.S. raw materials that it would suffer if those raw materials were denied it. Jefferson was certain that by withholding U.S. raw materials and agricultural products from Great Britain and France, the warring nations would relent and allow the U.S. to remain outside Europe’s conflict. In December, he convinced Congress to pass the Embargo Act of 1807, which, in short, grounded the U.S. merchant fleet.

Jefferson’s hope that the embargo would force the British and French to respect U.S. sovereignty was not realized. Rather, the embargo increased smuggling, created discord at home, compounded the economic hardships already faced in much of the United States due to a recession, and spurred the growth of U.S. industrialization and manufacturing. Jefferson was not a proponent of urban manufacturing and industrialization but a believer in the virtue of an agrarian lifestyle. He believed that the United States would become an “empire of liberty” if focus was placed on an agrarian lifestyle and the ills of industrialization were avoided.

Without the ability to export raw materials, U.S. entrepreneurs looked for ways to use them at home. A direct increase in U.S. manufacturing can be traced to this period of time. Of great importance was the development of U.S. textile manufacturing which was still in its early stages when Jefferson set out to embargo British and French trade goods. As strange as it might seem, the notion of supplying standardized and functional uniforms for the armed forces was still rather new. In 1732 Maurice de Saxe wrote of the importance of providing uniforms for the army in addition to food, shelter, and weaponry. While Jefferson believed Europe was dependent on U.S. raw materials, in reality the U.S. was more heavily dependent on Europe for manufactured goods. In order to become a great power like the powers of Europe and defend its sovereignty, the United States would need to industrialize. It would need to produce at home the materials essential for war which included the clothing worn by its armed forces.

Jefferson had hoped that the embargo would not only provide time for the young nation to bolster home defenses and prepare for war, but that it would prevent war altogether by forcing Europe to recognize its dependency on U.S. raw materials. However, the embargo demonstrated two invaluable lessons. First, economic strength was as vital as military might. An empire, even one of liberty, required the economic strength and diversity to withstand the challenges of war, even a war it wanted to avoid. Second, no nation was truly isolated enough to survive on its own. Isolation was a mythical ideal that economic codependency and the nature of war made unrealistic. The fates of the nations of the world were already irrevocably intertwined.