Tag Archives: America

Off the Battlefield and Socks

An old photo depicting men and women knitting socks flashes before my mind’s eye. Young and old, able-bodied and wounded, all were knitting socks as a way to support the troops of World War I. Today a trip to Walmart can easily supply a package of cotton socks. Wool socks, sturdy and durable, might take a bit more searching to find, but a visit to a good sporting goods store, especially one selling skiing supplies, will do the trick. The days when proper foot care required handmade socks are long gone, and with the passage of time the memory of the dedicated service provided by the sock makers has faded. It is estimated that sixty-five million men were mobilized to fight in WWI, and each soldier needed socks as he went to war, and then more socks to replace the ones worn out by long marches and damp trenches. On the home front, knitting campaigns called people to action. Idle hands at home meant soldiers on the battlefield would suffer.

The technological advancements of the early 1900s did not eliminate the need for handmade socks, and as the world entered a second war, the patriotic call again went out for more socks. However, technology had made war much more destructive. The bombing campaigns of WWII left towns in rubble and displaced an estimated sixty million Europeans. When the war ended, the hardships of war did not. Basic essentials for survival were still desperately needed. The infrastructure destroyed by military campaigns had to be rebuilt before the suffering could end. Battlefields had to be cleared and communities reestablished. Unfortunately, the humanitarian efforts of busy hands and caring hearts ran into political roadblocks. Decimated nations could not process and deliver the goods effectively. A care package from a distant relative or a long-distance friend had an easier time getting through to a family in need than did the large-scale aid from relief organizations.

By the end of the twentieth century, handmade socks were a novelty rather than a necessity, and nations had learned valuable lessons about both the effects of war on and off the battlefield, and the need for post-war recovery efforts to eliminate humanitarian crises once war had ceased. As the century ended, the severity of war had not necessarily diminished, but the percentage of the population directly affected by war had. War still displaced, disrupted, and decimated local populations, but seldom reached the distant homelands of the foreign nations providing military support for weak governments. Therefore, the patriotic call to serve those who sacrificed and suffered in the name of liberty, freedom, or national interest was easily drowned out by the pleasurable distractions of life in a homeland untouched by war. By the end of the twentieth century, war, much like handmade socks, was a novelty rather than a reality – something other people might do, but not something that had a place in the modern, fast-paced, safer world many were sure the new century would bring.

Victory in the End

Just as declarations of war seldom mark the moment conflict begins, peace treaties seldom mark the end. One of the most famous examples of a battle fought after a war had ended occurred 200 years ago during the Battle of New Orleans.[1] The war had been unpopular in the United States and victory on the battlefield scarce.[2] Pressure had been mounting to settle the war even without clear victory having been achieved. On December 24, 1814, a diplomatic contingent agreed to the Treaty of Ghent. The objectives for having gone to war had not been met, but the United States had proven to itself and the world that it could wage a war without the assistance of outside nations.

Four days after the treaty was signed, the Battle of New Orleans commenced. Under the command of Andrew Jackson, a militia force numbering around 4,700 faced off against 5,300 British army regulars who were supported by naval contingents. In the end, the British would suffer 2,400 casualties, the Americans only 70.[3] Having occurred after the peace negotiations had concluded, the Battle of New Orleans would have no effect on the end of the war but would have a lasting effect on the American psyche.[4] In 1959 Johnny Horton recorded the song “The Battle of New Orleans,” and it reached number one on the charts. Though a humorous version of the battle, the song reintroduced the public to a war often overlooked in U.S. history, a war that had solidified the independence hard won a generation prior. The Battle of New Orleans may not have ended the War of 1812, but it did end questions of U.S. independence, viability, and sovereignty.

The War of 1812 often gets overlooked, but the little war is the stuff of legends. From the burning of Washington, to the battle to save Baltimore, and finally to the Battle of New Orleans, the War of 1812 changed the United States. Diplomatically and militarily, the United States proved it could fight and survive without the aid of Europe. In the end, it mattered little that the victory at New Orleans occurred after peace negotiations had technically ended the war. In the end, it mattered little that the war as a whole had been a stalemate. In the end, what mattered was that victory had been possible, and decisive victory on the battlefield had been achieved. Ragtag or not, a nation set upon survival and independence had not been defeated. In the end, that was victory.

Endnotes

[1] (December 28, 1814 – January 8, 1815)

[2] George C. Herring, From Colony to Superpower: U.S. Foreign Relations Since 1776 (New York: Oxford University Press, 2008), 128.

[3] John Whiteclay Chambers, ed. The Oxford Companion to American Military History (Oxford: Oxford University Press, 2000), 496-497.

[4] Herring, 132.

Cuba and the United States

I have long found the US/Cuba situation fascinating, particularly in light of the fact that many nineteenth- and early twentieth-century U.S. politicians and businessmen wished to annex Cuba, or at least to keep Cuba a friendly U.S. playground. Cuba, so close to the United States, was often a hoped-for prize. Many power brokers in the United States felt sure Cuba would eventually choose to join its neighbor to the north. The fact that it never did, but instead rejected the United States during the Cold War, makes it all the more interesting and raises the question of why it chose such a different path from the one hoped for by men like Theodore Roosevelt, President McKinley, and many others.

In 2002, historian Louis A. Pérez, Jr. wrote an article for the Journal of Latin American Studies titled “Fear and Loathing of Fidel Castro: Sources of US Policy toward Cuba.” The following is a short paper I wrote after reading this and other articles discussing theories as to why the United States persisted with Cold War policies towards Cuba even after the end of the Cold War.

Loathsome Rejection: Cuba and the United States

Masked behind a cloud of Cold War fear, Cuba’s rejection of the United States was the loathsome reality of a failed U.S. attempt at imperial influence and a direct blow at the very heart of the Monroe Doctrine. Fidel Castro was “inalterably held responsible,” and according to Louis A. Pérez Jr. in “Fear and Loathing of Fidel Castro: Sources of US Policy Toward Cuba,” Castro became a problem that would blind policy makers for over forty years, even after the end of the Cold War.[1]

“Castro was transformed simultaneously into an anathema and phantasm, unscrupulous and perhaps unbalanced, possessed by demon and given to evil doings: a wicked man with whom honourable men could not treat.”[2]

Pérez stated that the “initial instrumental rationale” for U.S. policy toward Cuba, particularly the policy of sanctions, may have become “lost” over time, but that it was initially created under the precepts of containment.[3] However, in the case of Cuba, the practice of utilizing economic pressure through embargoes was undermined by the Cuban Adjustment Act of 1966, which allowed political asylum to any Cubans who made it to U.S. shores. This act became a release valve for the pressures created by the embargoes. While poor Cubans remained poor, the middle-class Cubans, who were most affected by U.S. sanctions, could attempt to seek refuge elsewhere. “The logic of the policy required containing Cuban discontent inside Cuba,” but this logic was lost amid the emotional reaction the United States had toward Fidel Castro and his rejection of the United States. This rejection was compounded by the challenge to “the plausibility of the Monroe Doctrine” and to the United States’ “primacy in the western hemisphere.”[4] If rejection was not enough to engender such resentment, inviting the Soviet Union to become a military as well as an economic ally was more than U.S. policy makers could stand without seeking retribution.

Cold War fear and rhetoric do not sufficiently account for the continued and virulent animosity between the United States and Cuba, and Pérez was not the only scholar to take note. As the Soviet system crumbled and the Cold War came to an end, “the antagonism displayed by the U.S. government toward Cuba and Castro … intensified.”[5] The continued containment of Cuba in the post-Cold War era negated decades of U.S. assertions that the Cuban policy was the direct result of its status as a Soviet satellite. While others would write about the illogical continuation of Cold War policy, Pérez argued that U.S. policy toward Cuba had less to do with Cold War fear and containment, and more to do with loathing and retaliation for the rejection of the United States and the embarrassment such a rejection caused.

Certainly there was a real national threat in having Soviet missiles located so close to U.S. shores, but that threat does not account for U.S. policy before and after the missiles. Wayne S. Smith, who was stationed in Cuba as a vice-consul during the Cuban Revolution, claimed that Castro and his revolutionaries were not communist threats in 1956.

“We found no credible evidence to indicate Castro had links to the Communist party or even had much sympathy for it. Even so, he gave cause for concern, for he seemed to have gargantuan ambitions, authoritarian tendencies, and not much in the way of an ideology of his own. He was also fiercely nationalistic. Given the history of U.S. military occupations, the Platt amendment, and the outsized U.S. economic presence in Cuba, he did not hold the U.S. in high regard.”[6]

Without a doubt, the United States needed to address the threat posed by Castro, but to bypass speaking softly and proceed straight to wielding a big stick was a move that would ensure crisis rather than avoid it, especially when the Soviet Union was more than happy to lend Cuba a hand. The Soviets’ willing assistance, especially after the embarrassment of the Bay of Pigs, was all the justification President Kennedy needed to pick the moment of crisis rather than give Nikita Khrushchev the opportunity.[7]

Pérez does not argue against the notion that there was a real threat posed by Cuba, but instead he points out that the United States was handed a “trauma” when the U.S. playground turned into a war zone, and then into a dangerous Cold War threat.[8] This trauma affected the U.S. ability to rationally create and implement a policy that would stabilize relationships and reduce threat. “Dispassionate policy discourse on Cuba … was impossible”[9] as long as Castro remained Cuba’s leader, because he was “a breathing, living reminder of the limits of U.S. power.”[10]

Endnotes

[1] Louis A. Pérez, Jr. “Fear and Loathing of Fidel Castro: Sources of US Policy toward Cuba,” Journal of Latin American Studies 34, no. 2 (May 1, 2002): 227, http://www.jstor.org/stable/3875788 (accessed February 20, 2013).

[2] Ibid., 250.

[3] Ibid., 228.

[4] Ibid., 233.

[5] David Bernell, “The Curious Case of Cuba in American Foreign Policy,” Journal of Interamerican Studies and World Affairs 36, no. 2 (July 1, 1994): 66, http://www.jstor.org/stable/166174 (accessed February 19, 2013).

[6] Wayne S. Smith, The Closest of Enemies: A Personal and Diplomatic Account of U.S.-Cuban Relations Since 1957 (New York: W. W. Norton and Company, 1987), 15-16.

[7] Philip Zelikow, “American Policy and Cuba, 1961-1963.” Diplomatic History 24, no. 2 (Spring 2000): 325. http://web.ebscohost.com.ezproxy1.apus.edu/ehost/detail?sid=39889c50-22ab-48a2-b2e4-cd8946fd73a9%40sessionmgr15&vid=1&hid=18&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#db=aph&AN=2954415 (accessed February 19, 2013).

[8] Pérez, 231.

[9] Ibid., 250.

[10] Ibid., 251.

Other Readings

Dominguez, Jorge I. “U.S.-Cuban relations: From the Cold War to the colder war.” Journal of Interamerican Studies and World Affairs 39, no. 3 (Fall 1997): 49–75. http://search.proquest.com.ezproxy2.apus.edu/docview/200219310/13BF83A38607C999D8F/7?accountid=8289 (accessed January 31, 2013).

Herring, George C. From Colony to Superpower: U.S. Foreign Relations Since 1776. New York: Oxford University Press, 2008.

Paterson, Thomas G. “U.S. intervention in Cuba, 1898: Interpreting the Spanish-American-Cuban-Filipino war.” Magazine of History 12, no. 3 (Spring 1998): 5. http://search.proquest.com.ezproxy2.apus.edu/docview/213739998/13BF824CD53256D7D45/11?accountid=8289 (accessed January 31, 2013).

Williams, William Appleman. The Tragedy of American Diplomacy. 1972 New Edition. New York: W. W. Norton and Company, 1988.

Paranoia and Insecurity: A Lesson from WWII

“On a morning in December 1941, a small nation which the United States had sought to contain and squeeze into submission through economic and diplomatic pressure, attacked with crippling force a naval base belonging to one of the largest nations of the world. Japan’s aerial attack on Pearl Harbor shook the United States and its sense of security.”[1] In the movie 1941, director Steven Spielberg created a comical portrayal of a population driven to protect their coastline from Japanese attack. In Spielberg’s outlandish film, the insecurity caused by the attack on Pearl Harbor fed paranoia and panic and resulted in chaos. The movie was a comical spoof on the real paranoia that existed during World War II, a paranoia which allowed a nation to justify its own attack on liberty.

On February 23, 1942, a Japanese submarine entered the coastal waters near Santa Barbara, California, and bombarded an oil field at Ellwood. Just days before the attack, President Roosevelt had issued Executive Order 9066, which authorized the creation of policies that would lead to the internment of U.S. citizens. Coupled with propaganda films portraying the enemy as barbaric and animalistic, the events of late 1941 and early 1942 created an insecurity within the population that seemed to justify the civil rights violations that would follow.

Terror is an effective tool in war and can have a much greater effect on a population than physical attack. An enemy will try to strike fear into the hearts and minds of its opponent with the hope that terror will weaken it. Modern technology made it possible for fear to be spread rapidly through media, and media played a vital role in spreading propaganda messages during World War II. The U.S. government worked hard to control propaganda, both the enemy’s and its own, but public fear was used as a tool to garner support as well. Justifiable actions of a nation at war, actions which deliberately heightened public fear and restricted civil liberty, seem less justifiable when the war ends but the insecurity remains. After World War II ended, the fear generated by the physical attacks on the nation diminished, but the fear created by the pervasive use of propaganda during the war remained embedded in the public psyche. History seems to indicate that nations can quickly recover from the physical challenges of war, but the psychological challenges, which are often heightened by the use of politically motivated propaganda, take much longer to repair. Long after the physical attack becomes just a memory, paranoia and insecurity can linger, continuing to justify the restriction of liberty.

Endnotes:

[1] Jessie A. Hagen, “U.S. Insecurity in the Twentieth Century: How the Pursuit of National Defense Ingrained a State of National Insecurity,” American Military University, 2014.

Additional Reading:

Conley, Cornelius W. “The Great Japanese Balloon Offensive.” Air University Review XIX, no. 2 (February 1968): 68–83. http://www.airpower.maxwell.af.mil/airchronicles/aureview/1968/jan-feb/conley.html.

Dower, John W. War Without Mercy: Race and Power in the Pacific War. New York: Pantheon, 1986.

Roosevelt, Franklin D. “Executive Order 9066 – Authorizing the Secretary of War to Prescribe Military Areas,” February 19, 1942. Papers of Franklin Roosevelt. The American Presidency Project. http://www.presidency.ucsb.edu/ws/index.php?pid=61698.

———. “Fireside Chat, December 9, 1941,” December 9, 1941. http://www.presidency.ucsb.edu/ws/index.php?pid=16056.

———. “Fireside Chat, February 23, 1942,” February 23, 1942. http://www.presidency.ucsb.edu/ws/index.php?pid=16224.

“Civil Rights.” PBS: The War. Last modified 2007. http://www.pbs.org/thewar/at_home_civil_rights_japanese_american.htm.

“George Takei Describes His Experience in a Japanese Internment Camp.” io9. http://io9.com/george-takei-describes-his-experience-in-a-japanese-int-1533358984.

National Security: the Value of Nutrition and Education

In the years leading up to World War I, many progressive thinkers began to campaign for social reform. The industrial revolution had changed society in many ways, not all of which were good for the nation or for national security. Unskilled and skilled labor alike were susceptible to the ills of urban life. Just as the war in Europe was igniting, one group of progressive reformers was introducing home economics textbooks and coursework into schools. Proper hygiene and good nutrition began to be taught alongside other subjects. Malnutrition and disease were viewed as ills which not only weakened society but undermined national well-being.

The reformers who pushed for better living conditions and education for urban families gained a powerful ally when the United States entered WWI: the U.S. Army. When faced with a modern war, modern both in weaponry and technology, the U.S. Army quickly discovered that it was no longer beneficial to go to war with illiterate soldiers. Modern war demanded healthy soldiers who could communicate efficiently with one another. Basic health and literacy became necessities for the modern army. The ground gained in understanding this truth was not easily won. The soldiers who fought in the war learned firsthand the value of both a healthy body and the ability to communicate with their fellow soldiers. Having a common language, coupled with the ability to read and write in it, would be something the returning soldiers would seek for their own children. These veterans would push for change. By the end of World War II the realities of modern war mandated a nation populated with citizens possessing basic health and education. Education and proper nutrition became a matter of national security.

Additional Reading:

  • Keene, Jennifer D. Doughboys, the Great War and the Remaking of America. Baltimore: The Johns Hopkins University Press, 2001.
  • National Security Act of 1947, Title X.
  • There were various publications designed to introduce Home Economics in the schools. Some have been scanned and can be found in different e-book collections. Original copies can be found through used bookstores. My favorites were authored by Helen Kinne and Anna M. Cooley.

No Man’s Land

A term older than World War I but popularized during that war, no man’s land refers to a stretch of land under dispute by warring parties, but it can also refer to lawless areas with little or no governing control. A buffer zone, on the other hand, is an area which provides a sense of protection from the enemy. When physical fortifications offer little protection, buffer zones can provide a perception of security. Nations great and small seek the perception of security when security is elusive. Treaties and alliances are traditional means of creating a sense of security, as is the creation of buffer zones. During the Cold War, the competing nations sought to expand their spheres of influence, thereby creating buffer zones between themselves and their enemies as their spheres grew. When the Cold War ended and the buffer zones were no longer needed, many of the buffer nations found themselves with fewer friends and with fewer resources to prevent lawlessness. These nations found it difficult to avoid the development of no man’s land within their borders.

The United States reasoned, even in its earliest days, that oceans made excellent buffer zones against the conflicts of Europe. Unsettled territories were adequate as buffers, but only to a point. While unsettled territories didn’t pose a direct European threat, they were still loosely under the influence of powerful countries. Additionally, they often attracted outlaws fleeing justice and smugglers seeking a base of operations near their markets. In 1818, Andrew Jackson decided to pursue a group of raiders into Florida. The problem was that Florida was owned by Spain, and Spain had little ability to prevent lawlessness in the territory. When Jackson’s army crossed into Florida, he invaded a foreign nation. Without the consent of Spain, such an action created an international incident. Fortunately, Secretary of State John Q. Adams was able to capitalize on Jackson’s actions and convinced Spain that a treaty was better than a war. His reasoning for defending Jackson’s violation of Spanish sovereignty was that “it is better to err on the side of vigor.”[1] It was certainly not the first time a nation chose a declaration of strength as its response to an international crisis of its own making, but it was possibly the first time such a response became national policy. As Secretary of State, Adams greatly influenced the foreign policy decisions of the president and authored much of what President Monroe presented to Congress. In March 1818, President Monroe declared to Congress that when a nation no longer governed in such a way as to prevent its lawlessness from spilling onto its neighbors, the neighbors had the right to protect themselves and to seek justice even if it meant violating the sovereignty of another nation.[2] In other words, when an area became no man’s land, it was to the benefit of all nations for the lawlessness to be eliminated by whoever had the strength and will to do so.

Eliminating no man’s land in North America was a task that occupied the United States for more than a century. Eventually, the United States would reach from ocean to ocean and would gain the military might of a great nation. However, even as the twentieth century dawned, the United States struggled to bring law to all of its territory. During the century of expansion, some in the United States saw potential in the acquisition of territory to the south, particularly in Central America. Others recognized the difficulty of governing such a vast nation. Faced with lawlessness due to revolt in Mexico during World War I, President Wilson authorized the U.S. Army’s invasion of Mexico. However, Wilson recognized the value of having a buffer zone south of the border and eventually withdrew the army. In order to ensure that the southern nations created a friendly buffer zone, the United States supported governments that kept the peace, even though keeping the peace came at the expense of basic human rights. Like many leaders before and since, President Wilson put aside ideology and accepted peace-by-force as being better than lawlessness.

Reflecting on history, some leaders have sought security by building huge empires, some by establishing buffer zones, and others by the targeted elimination of no man’s land. Regardless of the method men and nations have chosen, it is clear that international law, notions of liberty and self-determination, and hope for world peace are always secondary to the goal of eliminating the threat posed by no man’s land.

Endnotes:

[1] Samuel Flagg Bemis, John Quincy Adams and the Foundations of American Foreign Policy (New York: Alfred A. Knopf, 1956), 315-316.

[2] James Monroe, “Spain and the Seminole Indians,” American Memory, Library of Congress, (March 25, 1818),  http://memory.loc.gov/cgi-bin/ampage?collId=llsp&fileName=004/llsp004.db&Page=183.

At the End: the Cold War

Twenty-five years ago the Berlin Wall was opened. Unplanned and unauthorized by the powers that controlled the border between east and west, the opening of the Wall signified the end of the Cold War and the beginning of a new era. While the momentous act of opening a gate and letting people pass from east to west gained much attention at the time, other factors had been at play that would pave the way to peace and solidify the end of the Cold War in ways which went relatively unnoticed by the general public. Much has been written on the subject, but not by authors with huge public followings. In honor of the twenty-fifth anniversary, we should look back. The following is a short essay* on one aspect of the end of the Cold War – just enough to pique your interest.

Historian John Lewis Gaddis has written that “the Cold War itself was a kind of theater in which distinctions between illusions and reality were not always obvious.”[1] It was fitting, then, that thespians took the stage for the final act. While it is common knowledge that President Ronald Reagan graced the silver screen in his younger days, it is less well known that other important actors of the final act had theatrical experience prior to their Cold War roles. Mikhail Gorbachev had been an “aspiring actor”[2] in his youth, and the influential Pope John Paul II “had been an actor before he became a priest.”[3] The success of such actors on the Cold War stage was not due simply to their arrival upon the stage, but due in great part to the stage setting they inherited.

As with the origins of the Cold War, the end of the Cold War is not precise. Unlike hot wars, which tend to end with the signing of peace treaties and have a clear chain of events preceding peace settlements, the end of the Cold War is ambiguous. As historian George C. Herring pointed out, there is a myth that Reagan’s strong posturing and rhetoric were the direct cause of Soviet defeat.[4] Yet to subscribe to such a myth negates the important roles of the other actors and of those who set the stage on which the thespians performed. Gaddis wrote, “it took visionaries – saboteurs of the status quo – to widen the range of historical possibility.”[5] More importantly, it took actors well versed in the art of improvisation, actors who could recognize the changing dynamics of the Cold War and grasp the opportunities for change. While there were numerous scenes in the last act of the Cold War, each of the three actors played a key role.

First, after the Able Archer exercises, President Reagan “drew the obvious – but for Cold War adversaries often elusive – conclusion that the Soviets feared the United States as much as Americans feared them.”[6] This shift led Reagan to adjust his strategy. While on one hand he ratcheted up the rhetoric, on the other he became more amenable to negotiations because he knew the United States had the upper hand.

Second, Mikhail Gorbachev recognized that public language was not really the same as diplomatic language, and that politicians like Reagan were acting to an audience. While Reagan certainly had an image to keep and a role to play, Gorbachev had an equally crucial, if not more crucial, part to play. He had to convince his people that glasnost and perestroika were positive changes, and that negotiations with the West were not signs of weakness.

The third actor, Pope John Paul II, helped “expose disparities between what people believed and the systems under which the Cold War had obliged them to live.”[7]

The Pope’s visit to Poland revealed that the USSR’s satellite enjoyed no popular legitimacy: They were Puppet regimes hated by their subject population. But Pope John Paul went further. He demystified the power of those regimes. With his words, his presence, and his injunction not to feel afraid, the Pope was for a while the real government of Poland.[8]

It would be wrong to assert that the Cold War, even in its final decade, lacked any real cause for fear, but Pope John Paul II diffused the overwhelming and consuming fear that had dominated the public since the days of Stalin.

The three actors took the world stage and improvised rather than continuing with the Cold War script, in which the bipolar status quo was viewed as “more stable than multipolar systems.”[9] The final act of the Cold War commenced once these three actors realized that President Roosevelt had been correct – the only thing to fear was fear itself – and that the political divide could be cracked and then normalized once the people stopped feeling the oppression of that fear. While Cold War theatrics occasionally resurfaced, particularly when Reagan gave his famous “tear down this wall” speech in 1987, they did not deter the movement towards normalization between the United States and the Soviet Union.[10] The real success of the final act of the Cold War play is that while tough talk and grand speeches still maintained the public perception of strength, real changes were occurring within the Soviet Union. The stage had been set by the policies of containment, the “collapse of détente,” the inherent weaknesses of the Soviet system, and mutual overspending on deadly war machines, but the final act was the result of leaders desiring a change in the status quo.

* Due to unexpected issues this week, I am recycling an old essay rather than creating something new to commemorate the anniversary of The Fall of the Berlin Wall.

Endnotes:

[1] John Lewis Gaddis, The Cold War: A New History (New York: The Penguin Press, 2005), 195.

[2] George C. Herring, From Colony to Superpower: U.S. Foreign Relations Since 1776 (New York: Oxford University Press, 2008), 894.

[3] Gaddis, 195.

[4] Herring, 894.

[5] Gaddis, 196.

[6] Herring, 896.

[7] Gaddis, 196.

[8] John O’Sullivan, “Warm Cold Warrior,” National Review 57, no. 7 (April 25, 2005): 38.

[9] Gaddis, 196.

[10] Herring, 898.

Man or Machine: War in the 20th Century

World War I changed many facets of warfare, particularly where technology was concerned. The automobile, chemical weapons, and tanks stand out, but so do airplanes, submarines, and machine guns. Some of these weapons were developed during the war and some were simply advanced beyond their pre-war status. The prevalence of the machine in World War I marked a dramatic shift in how the wars of the twentieth century would differ from those of previous centuries. These new machines created mass casualties beyond anything Europe had ever experienced.

By the end of World War II and the beginning of the Cold War, nations would wonder whether machines could replace the soldier, or at least reduce the cost in human life. President Truman considered nuclear technology as a means to reduce Allied casualties. After two devastating world wars, the notion of technology carrying the brunt of the work was appealing. The second half of the twentieth century provided opportunities for the theory to be tested. Air power became a key component in strategic planning. Bombardments from the air, whether from aircraft or from missiles located many miles from the conflict zone, devastated communities. All indications seemed to point to a day when machines would replace boots on the ground. However, machines did not rout the enemy regardless of the devastation they created. As the twenty-first century dawned, warfare seemed to depart from the oceans and grand battlefields where the machine dominated and instead entered the villages and city streets where man could maneuver more adeptly. Despite all the technological development of the machines of war, man, with his natural adaptability, remained supreme time and time again.

Unexpected Consequences: Embargoes in the Early 1800s

Thomas Jefferson’s Embargo Act of 1807 did not work out the way he had planned. The restricted flow of British goods entering the United States spurred the development of U.S. manufacturing and changed society. Even the Embargo Act’s replacement, the Non-Intercourse Act, which allowed for trade with nations other than Great Britain or France, did not halt the changes occurring within the United States. Interestingly, two changes in domestic life resulted from the embargo and the war: increased investment in cotton textile manufacturing, and the development of the iron industry, particularly the production of cast iron. Eventually these developments in U.S. manufacturing would provide cheaper textiles for the home and promote changes in the production of food through the proliferation of the cooking stove.

In the early 1790s Samuel Slater helped develop the first modern cotton mill in the United States, but it was two decades later, during the era of embargo and war, that Francis Cabot Lowell built his textile mill system, which brought the production of cotton thread and of finished woven cotton fabric together in one location. These changes reduced the cost of production and increased productivity, making cotton fabric more available to the average household. Before these changes in cotton textile manufacturing, cotton was considered a luxury fabric. Whereas flax could be grown easily and turned into linen by skilled spinners and weavers in the United States, cotton fabric was typically imported prior to the development of U.S. manufacturing in the early 1800s.

In addition to the availability of less expensive textiles, changes in cooking methods were occurring due to the new technology of the cooking stove. While the first modern cooking stoves began appearing in the mid-1700s, it was not until the end of the century that the major flaws had been worked out. Yet the cooking stove remained beyond the reach of the average home due to the high cost of cast iron in the United States. The embargoes and the War of 1812 highlighted the need for increased domestic production of iron, and by the 1820s iron production had spread through Pennsylvania, with Pittsburgh becoming known as the “smoky city.”[1] The greater availability and affordability of cast iron stoves changed the way food was prepared. Not only were the stoves safer than open fires, they also allowed the cook a greater range of dishes. No longer limited to a stewpot or spit, a woman could prepare a larger variety of food for her family without the need for additional labor in the home.

Thomas Jefferson had opposed men like Alexander Hamilton who promoted the development of U.S. manufacturing. Neither Jefferson nor Hamilton was blind to the ills of industrialization, but while Jefferson preferred to believe that the agrarian lifestyle was superior to manufacturing and would better promote liberty, men like Hamilton understood that economic power would be required to protect that very same liberty. The production of raw materials alone would not be enough to propel the United States to greatness, and without economic strength, liberty would always be threatened by the dangers of imperialism and war. So while Jefferson’s embargo was meant to pressure the European nations into respecting U.S. sovereignty, it served instead as an affirmation that U.S. manufacturing was vital to U.S. survival. The embargoes also helped change domestic life in the United States: textiles for bedding and clothing became more available and less expensive, and the entire method of cooking was transformed. The U.S. response to international conflict in the early 1800s had the unexpected consequence of increased investment in manufacturing, which in turn changed society by transforming life in the home.

[1] Anne Madarasz, “Tracing the Smoky City,” Western Pennsylvania History, 2002, (accessed October 18, 2014), https://journals.psu.edu/wph/article/viewFile/5111/4894.

War Hawks

In the spring of 1811, a group of young men arrived in Washington, D.C., to fill congressional posts. Led by men like Henry Clay and John C. Calhoun, these new members of Congress called for stronger measures in dealing with Europe and with the American frontier. War was viewed as the answer to problems unresolved by diplomacy or embargo. Europe, it seemed, placed little value on U.S. sovereignty. The wars of Europe threatened U.S. economic stability and increased the British tendency to confiscate U.S. ships and impress U.S. sailors. On the western frontier, the native population was friendlier to British Canada than to the United States, raising the worry that Great Britain might use the natives to further challenge U.S. sovereignty. Additionally, Spain did little to curb the native raiding parties that caused havoc along the U.S. southern border. The War Hawks, as the new congressmen were called, believed that war was not only inevitable but also the only practical solution.

Henry Clay stated, “…where is the motive for longer delay? … Our wrongs have been great; our cause is just; and if we are decided and firm, success is inevitable.” He continued with assurances that the United States was not only prepared but that Britain would not bother with another war in America: “The idea is too absurd to merit a moment’s consideration.”[1] By the end of 1814, the British had burned Washington, and U.S. leaders in the northeast were discussing secession as a solution to the economic crisis plaguing their region. It was not battlefield victory, however, that ended the war Clay had so eagerly sought, but diplomacy, and the peace was concluded before Andrew Jackson’s famous victory at New Orleans. The United States had fought once more to establish its sovereignty, but peace was not won by U.S. military prowess; rather, once again the British grew weary of war with the Americans, and a diplomatic solution was sought by both sides. Clay had been right to some extent when he said the British would not want to fight on American soil, not after such a long struggle with Napoleon.

Europe was war weary, and the United States, while not strong enough to defeat Europe’s armies and navies, was strong enough to make war an unappealing prospect. Additionally, all of Europe was ready to see an end to revolution and to the international wars that had caused immeasurable strife for decades. The powers of Europe, under the new Concert of Europe, would go to great lengths to prevent war and would work to create a balance of power that deterred nations from seeking war with their neighbors. This dedication on their part afforded the United States the opportunity to grow as a nation in both size and strength without being entangled in or harmed by European war.

U.S. politicians dedicated to expansionist policies were well aware of the European frustrations with war. In the years following the end of the War of 1812, the United States capitalized on Europe’s preoccupation with keeping peace at home, expanding westward and southward. Great Britain, with its powerful navy still intact after decades of war, was the only nation that truly challenged the United States, and the British seemed content to focus on trade rather than colonization in the Americas.

While the U.S. managed to free itself from the machinations of Europe during the War of 1812, there may have been some unexpected consequences. The war may have increased the belief that a weaker nation could defeat a much greater military force simply by wearing down the enemy’s desire to fight. The United States would become an example that many would emulate in the future, not always to the benefit of the United States or its allies.

Additionally, the war promoted the notion that a just cause made for a successful war. Success was not inevitable as Henry Clay had claimed; in truth, it was anything but. Military theorist and Prussian general Carl von Clausewitz wrote that war was a “game in which Time and Chance shuffled the cards; but in its signification it was only diplomacy somewhat intensified, a more vigorous way of negotiating, in which battles and sieges were substituted for diplomatic notes.”[2] In the case of the War of 1812, “time and chance” favored the United States. War may, in the end, be unavoidable, but success in war is never guaranteed, regardless of the rhetoric and zeal of people best labeled War Hawks.

[1] Henry Clay, “Letter in Support of the War of 1812,” 1812, (accessed October 16, 2014), http://teachingamericanhistory.org/library/document/letter-in-support-of-the-war-of-1812/.

[2] Carl von Clausewitz, On War, trans. Dominique Poirier and J. J. Graham, Kindle ed., 2010.