Category Archives: Unexpected Consequences

War: More than Just a Battlefield

John Quincy Adams was just a boy when his father helped create a new nation. His father was not a great soldier, but he was a great philosopher and diplomat. Memoirs and biographies of John Quincy Adams note that he assisted his father during the years the nation was being forged. The young Adams grew into a man determined to defend and promote the ideals he was taught while still a lad running errands for the men who were carving out a new nation. While many know him only as a former U.S. president, historians recognize him as the man behind the Monroe Doctrine. Despite his age and physical limitations during the War for Independence, John Quincy Adams did experience and participate in war as a boy.

Oftentimes in the modern age, we think of veterans of war only as those who carried a gun and experienced direct combat. In some forums, debate rages as to whether a uniformed soldier who worked at a desk during a war can be considered a “true” veteran of war; as if participation in war and the repercussions of war only matter when shrapnel is present.

War is not just fighting on battlefields; it is also aiding those displaced and disrupted by the fighting.

Whether one wears a uniform or not, whether the war is near or far, and regardless of whether we ever see a frontline, war will affect us all. In some cases, as with John Quincy Adams, war will impact the youth in ways we can only hope will lead them to seek a better world.

War is never just waged on a battlefield.

******

For Further Reading:

John Quincy Adams and the Foundations of American Foreign Policy by Samuel Flagg Bemis

John Quincy Adams: A Public Life, A Private Life by Paul C. Nagel

Empty Tributes and Avoiding Change

A recent discussion concerning cultural appropriation has raised an interesting question that needs pondering. Should it be considered an honor to have something attributed to a group, even if the thing in question is not a traditional piece of the group’s culture? Why, then, would a group of people be irritated or offended by such an honor, such a tribute? Upon pondering this, another question arises. Who does the tribute actually benefit – the recipient or the one bestowing the tribute?

The tribute that generated these questions concerns the technique of chain-plying, a yarn spinning technique believed to have existed throughout the world prior to modern history. In the United States, this technique gained the name “Navajo plying” because the indigenous people, the Diné (commonly known as the Navajo), were known to use it in their weaving. It was not necessarily a traditional spinning technique for them, but rather a way of finishing a woven product. This raises the question: wouldn’t referring to the spinning technique as Navajo plying be incorrect, or at best an empty tribute?

A tribute that is empty, not directly associated with any reason to honor or give acclaim, has inherent problems. Primarily, paying tribute without any real justification is often the result of a desire to feel better about how one has treated another. In simple terms, a tribute of this kind is made from a desire to make amends for ill past and/or present actions. Therefore an empty tribute benefits the one bestowing it rather than the recipient. Does this tendency derive from racism? Is it merely a byproduct of colonialism? Can it simply be attributed to the notion that another culture is exotic and desirable? Or is the tendency simply paternalistic in nature – the notion that an honor is being bestowed on a lesser society who should be grateful for the tribute?

Throughout history, society has experienced the clash of cultures. It has also experienced the blending of cultures. Scholars now consider how the cultural blending of the past affects the people of the present. In particular, the question is raised as to whether the cultural blending of the past provides equality or discrimination for current members of the society. These considerations, and the subsequent calls for change, have caused their own clashes of culture. In recent times, tributes, particularly in the form of statues and monuments, have become the catalyst for heated debates and deadly violence. While these tributes may have originated out of differing intent than the empty tribute described above, when they are challenged the reaction is quite the same.

While it surprises few that challenges to statues and monuments associated with historical identity generate conflict, it may surprise many that something as seemingly simple as what people call a spinning technique generates similar conflict. The heated debate over, and attempt to correct, the name of a spinning technique highlights an issue: change often causes someone to feel a sense of loss or inconvenience. One would think that by changing to a more universally understood name, one of a more descriptive nature and already in general use, no one would feel a loss. At most, only a small inconvenience might be felt as an individual becomes accustomed to a different name. However, even when change benefits another individual or group, and where the change is of minor inconvenience, the change can generate a sense of loss for some. It can even generate a fear of greater loss. Therefore avoiding change, particularly when it means holding onto empty tributes, seems reasonable to many.


Additional Materials:

The Age of Homespun: Objects and Stories in the Creation of an American Myth by Laurel Thatcher Ulrich

Deculturalization and the Struggle for Equality: A Brief History of the Education of the Dominated Cultures in the United States by Joel Spring

Video blog on terms used in fiber arts by Abby Franquemont


Home Production and National Defense

One hundred years ago, malnutrition was a problem that worried a nation facing war. Industrialization and urban growth had moved large populations into congested cities and away from rural communities. Both World War I and World War II would see an increase in the urbanization of the United States. The progressive reformers of the early twentieth century recognized that urbanization was leading to an unhealthy population and pushed for reform. They also pushed for vocational education, particularly in the area of what would become known as Home Economics.

One of the great misconceptions of the modern age is that the skills of the preindustrial age were easily passed from generation to generation, and that it is only modern society that struggles with the problems associated with the loss of these skills. Unlike the dissemination of information, knowledge is gained through practice. Skilled crafts and vocations require practice and often a good deal of instruction by a skilled guide. Remove proper training, and the skills are not learned and society struggles. In particular, modern society struggles with issues of malnutrition and, more recently, obesity, both of which can be directly linked to a lack of basic knowledge of nutritional food consumption. It could also be argued that the conveniences of modern food production contribute to the problems, especially when the issue of ‘prepared’ foods is under discussion. Despite the flood of DIY programs and videos demonstrating cooking and gardening techniques, home production and preparation of food is not as common as needed for a healthy society.

New technology in the early 1900s brought advancements in home food production and storage, but the skills needed to safely process food had to be learned. During WWI, home canning and food storage were demonstrated and encouraged by divisions of local government and subsidized by the U.S. Department of Agriculture.[1] The Smith-Lever Act and the Smith-Hughes Act provided funding for increased training in food production and domestic skills.

According to historian Ruth Schwartz Cowan, the “decade between the end of World War I and the beginning of the depression witnessed the most drastic changes in patterns of household work.”[2] Industrialization was changing the way work was managed, not just in the factories, but also in the homes. Industrialization increased the availability of commodities, many of which made household work less time consuming and arduous. Convenience is usually an appreciated commodity, especially for those tasked with managing a household while feeling the pressures of working outside the home. However, the skills that had been learned before convenient options became available were not always passed down to the next generation. Much like the youth of today, the youth of past generations seldom liked learning to do things the old-fashioned way, especially not when new technology and innovation were changing the world.

In order to offset the trend and ensure a healthier society, young women in private and public schools were taught the skills that many today assume would have been handed down from mother to daughter. Books titled Clothing and Health, Shelter and Clothing, Foods and Household Management, and Household Arts for Home and School were produced and marketed to U.S. high schools. In the words of one author, “The authors feel that household arts in high schools should not be confined to problems in cooking and sewing. They are only a part of the study of home making.” In the 1915 edition of Shelter and Clothing, an entire chapter is dedicated to “the water supply and disposal of waste,” and includes diagrams of the modern flushable toilet. Technology had changed the lives of many, but progressive minds of the age could see that new technology had to be integrated into society through education rather than simply leaving society to work through the changes without assistance.

World War I, the Great Depression, and World War II jolted policy makers into action. By the mid-1950s, Home Economics, as a high school subject, was accepted as an integral part of keeping the nation healthy and ready for future war. Even as warfare became more mechanized, the nation still held on to a belief that a healthy society was a strong society, and many school systems encouraged both male and female participation in Home Economics during the early 1980s. Unfortunately, the Technological Revolution of the 1990s and 2000s shifted the mindset of many, and like the industrial revolutions of the past, this latest revolution has elevated convenience over skill. While information is just a click away, the knowledge that comes from skilled instruction is often harder to obtain, placing the nation at risk once more.


Endnotes

[1] Emily Newell Blair and United States Council of National Defense, The Woman’s Committee: United States Council of National Defense, An Interpretative Report, April 21, 1917, to February 27, 1919, e-book (U.S. Government Printing Office, 1920).

[2] Ruth Schwartz Cowan, “The ‘Industrial Revolution’ in the Home: Household Technology and Social Change in the 20th Century,” Technology and Culture 17, no. 1 (1976): 1–23.


Obligated to Intervene

In 1820, the Congress of Troppau was convened. The great powers of the day determined that they held the right to intervene in the revolutionary conflicts of neighboring states. Maintaining the status quo and preventing the spread of nationalism and revolution was viewed as vital in the quest to quell the type of conflict that had erupted in Europe during the French Revolution and the Napoleonic Era. While the beginning of the century had been fraught with what some called the first worldwide war, the remainder of the century saw only regional conflicts, most of which were harshly quelled before they could spread outside their borders. However, the policy of intervention did not quell nationalism. During the twentieth century, nationalism would be at the heart of many conflicts, and the notion that great nations had the right to intervene to protect the status quo would be at the center of international policy for many nations, including the United States.

In the case of the United States, intervention became a tool to either protect or disrupt the status quo in a region, depending on which was most beneficial to the interests of the United States. Intervention often placed the nation at odds with its own revolutionary history and patriotic rhetoric. Despite seeming hypocritical in nature, the United States was not forging new diplomatic patterns but rather following the patterns established by the great powers of the past. The U.S. Founding Fathers may have wanted to distance themselves from the politics and practices of Europe, but their descendants embraced those policies as the United States rose to international supremacy during the twentieth century.

During its rise to superpower status, the United States benefited economically and politically. The right to intervene allowed the United States to protect economic markets, and in some cases add new markets and resources to its growing stockpile. While the nation doggedly denied that it was an empire, by the end of the twentieth century the problems associated with empires began to plague the nation. Most prominently, it could be argued, the United States faced the growing international expectation that it would intervene when conflict threatened a region’s status quo. After a century of gaining prominence and wealth through international intervention, often with the sole goal of protecting resources and markets, the United States found that the right to intervene had transformed into an obligation to intervene.

Liberty: A Cost of War

During war, even a war fought in far-flung lands, the civilian public is not guaranteed the comforts of peacetime. Rationing of food and clothing can be expected as a nation directs its energy and material goods toward the war effort. Additionally, one can expect taxation to increase as the nation’s war debt mounts. However, when one’s liberty becomes a cost of war, the nation faces a crisis that is much more difficult to overcome with patriotic slogans. Fear, spread through propaganda campaigns and doom-inspiring rhetoric, becomes the tool that convinces a nation that the loss of constitutionally protected liberty is a price worth paying for the ultimate goal of winning the war.

In the mid-to-late 1700s, the cost of war was felt most heavily in the form of taxation. Colonial Americans opposed the new taxes despite the fact that the taxes helped pay for the military support the colonists benefited from each time a frontier war erupted. Their argument, in simple terms, was that if they were to be taxed like regular English subjects, then they should have all the rights and privileges afforded to regular English subjects. In particular, they should have the right to political representation. When their demands for equality were not heeded, the colonists decided that rebellion was the solution. War weariness and the costs of war played a large role in the final outcome. Endless war was not a good national policy, and even the powerful British Empire had a difficult time arguing against that truth.

During the American Revolution, the colonists who supported rebellion and sought independence were willing to sacrifice personal comfort for their cause, but that dedication was challenged when the new nation found itself sacrificing economic prosperity under the Embargo Act of 1807. In an ill-conceived attempt to force France and Great Britain to treat the United States with greater respect, President Thomas Jefferson and Congress passed an embargo that resulted in great hardship for New England merchants. Fortunately, the War of 1812 concluded just as the anger in New England was reaching a boiling point, and President James Madison was not faced with the daunting task of suppressing a homeland rebellion.

When homeland rebellion did finally erupt years later, as the national argument concerning slavery boiled over, President Abraham Lincoln did not hesitate to suspend certain constitutionally guaranteed rights in an effort to settle the conflict more quickly. His justification was that those who were trying to separate from the union, and those who were a direct threat to the union, were not necessarily protected by the constitution. He was not alone in his evaluation that certain liberties might need to be curtailed during war. The remnants of Congress agreed and passed the Habeas Corpus Suspension Act of 1863.

Economic hardship and the forfeiture of liberty seemed justifiable when the nation was at war, especially if the forfeiture of liberty was directed at those who seemed set on disrupting the nation’s ability to fight the war. It should not come as a surprise that when the nation went to war after the bombing of Pearl Harbor, those who seemed too closely tied to the enemy found themselves stripped of their constitutionally protected liberty. It mattered little that their ties were familial in nature as opposed to political. The nation had to be protected in order for the United States to prevail. In the end, the war lasted only a few short years. The rights and liberty of the interned were restored, everyone went on their merry way, and the nation flourished as it helped rebuild the free world. Or so the propagandists proclaimed.

Yet another enemy lurked and another war loomed. Constitutionally protected rights were no longer sacred in the face of an enemy. A nation at war, even a cold one, had to protect itself from enemy sympathizers and subversives. If this meant spying on its own citizens, then that is what the nation would do. When the truth of this violation became publicly known after the burglary at the FBI office in Media, Pennsylvania, in 1971, Congress acted to halt such a travesty, but it was questionable even at the time whether the actions of Congress would hold up during the ongoing Cold War.

War, it seemed, would always be a justification for a temporary loss of freedom and liberty, but as the twentieth century ended and the twenty-first century began, war shifted away from the traditional conflicts that erupt between two political enemies. Instead, war became a conflict with phantoms and ideologies. First there was the War on Drugs and then the War on Terror, both eroding the protections guaranteed in the constitution, and both without any end in sight. The cost of these wars continues to be great, and it seems that rather than causing economic hardship and the sacrifice of personal comfort, these wars demand a greater price – liberty.

Power and Chaos

Prior to the chaos of the French Revolution and Napoleon’s meteoric rise to power, three great powers balanced the Western World: Great Britain, France, and the Ottoman Empire. The Far East and the Americas were still peripheral, with only the United States disrupting the colonial empire system in any fundamental way during the eighteenth century. Throughout the nineteenth century, the three great empires faced ever-growing challenges as nationalistic zeal spread worldwide. In response to the chaos created by both the French Revolution and the Napoleonic era, the great powers of Great Britain, Austria, Prussia, and Russia chose to form an alliance that they hoped would prevent a repeat of the decades of war. They also redoubled their efforts to contain and control their own territories. The great threat to political stability came from two entities: empire seekers and nationalistic zealots. Control and contain both, it was believed, and chaos could be avoided. Yet as well conceived as the Concert of Europe was for its age, there was an inherent flaw in the concert system. The very nature of forming alliances to prevent imperial expansion or nationalistic revolution also entangled the great nations and would, in the early twentieth century, lead them into another great international conflict. Fear became the demon; fear of what would happen if a nation chose not to honor the treaties and pacts.

The twentieth century saw the rupture of empires and the colonial system that had made those empires great. While the rupture was often bloody and chaotic, there remained a level of control because, as the great empires of the past declined, two even greater powers replaced them. Historians and political scientists argue over whether these two great nations ever became empires in the true sense, or whether they were only empires of influence during the second half of the twentieth century. They do, however, agree that the influence of the United States and the Soviet Union during the Cold War suppressed a great deal of the chaos that might have erupted as colonial shackles were lifted and fledgling states emerged as independent nations. As fifty years of Cold War ended, rather unexpectedly and abruptly, the world faced the daunting task of answering the ultimate question. What would come next?

One political scientist suggested an answer. “The great divisions among humankind and the dominating source of conflict will be cultural… the clash of civilizations will dominate global politics.”[1] Unlike the independence movements that plagued international stability in the eighteenth, nineteenth, and twentieth centuries, the twenty-first century has seen a greater surge of culturally driven conflicts, some contained to rhetorical mudslinging, and some violent, bloody, and devastating to the peoples who get in the way of power-seeking individuals who achieve dominance through the spread of chaos. Cultural conflict has grown over the last decade, and it threatens stable and weak nations alike. It is not limited to the traditionally war-torn regions of the world, and it will take cooperation to counter it. Like the great nations that faced the chaos of the French Revolution and the Napoleonic Wars, the nations of today must find a way to combat this growing crisis; a way that recognizes that the chaos is the goal of the enemy and not simply a byproduct.


Further Reading

Samuel P. Huntington, The Clash of Civilizations and the Remaking of World Order (New York: Simon & Schuster, 2011).


Endnotes

[1] Gideon Rose, ed., The Clash at 20, e-book (Foreign Affairs, 2013), ForeignAffairs.com.


Going to War: Purpose and a Plan

Continuing last week’s post about the study of the motivations of war, I decided to revisit something I wrote a couple of years ago.

The Spanish-American War and the subsequent Philippine War were short wars by U.S. standards, but they had far-reaching consequences. President McKinley’s “limited war strategy” was intended to gain independence for Cuba, but its limited scope also included a limited understanding of the consequences of international conflict.[1] Simply put, the United States was unprepared for war. While the navy was somewhat prepared, the army struggled under continued state and congressional opposition to a strong peacetime military force.[2] As with the American Revolution and the Civil War, untrained volunteers, “who fancied they were soldiers because they could get across a level piece of ground without stepping on their own feet,” were mustered and sent to war with little opportunity for training.[3]

Lack of preparation was one of the issues faced during the “splendid little war.” Of greater issue was the lack of a clear objective for the war. If independence was the objective, then it would have seemed logical for the United States to have had greater respect for the native rebels who had worn down the Spanish forces before the U.S. arrival. Rather than respecting and aiding the rebel effort, the United States went from liberator to conqueror and rejected the notion of revolution and self-governance. Instead, the United States implemented a paternalistic imperial rule over the former Spanish colonies. Although there would be efforts at nation building and promises of self-rule, economic and military dependency became the reality.

Whatever goals President McKinley might have had in justifying war, they seem to have gone with him to his grave.[4] While Cuba would achieve a semblance of independence once the war ended, the Philippines would find itself embroiled in further war and facing an arguably unwanted annexation. The United States would become an empire by default more than by plan. McKinley’s little war would also have unexpected, long-term consequences on U.S. military strategy.

The Spanish-American War and the Philippine War, which created a new empire, would encourage future generations to believe that a guerrilla opposition could be snuffed out with enough oppression, pacification, and force. While McKinley had not recognized the nature and consequences of international war coupled with imperial occupation, later presidents would justify future international wars based on the perceived successes of these conflicts. Only after it was too late would they realize that occupying islands cut off from allies and supplies was an easier task than occupying lands connected to supply networks. In a time when photographic war journalism was in its infancy, and the atrocities of war could still be ignored by civilians in the United States, pacification policies, total suppression of civilians and combatants, and a scorched-earth policy could subdue an enemy without public outcry. The United States would eventually learn that people may cry for war when national interests are at risk, but they have little stomach for the devastation war brings when faced with its brutal reality.

Former U.S. secretary of state and retired general Colin Powell once said, “War should be the politics of last resort. And when we go to war, we should have a purpose that our people understand and support.”[5] More importantly, a nation should only go to war when the president understands the clear purpose of the proposed war and has thoroughly weighed the consequences, both short-term and long-term.

Endnotes

[1] Allan R. Millett and Peter Maslowski, For the Common Defense: A Military History of the United States of America, rev. and exp. ed. (New York: Free Press, 1994), 286.

[2] Ibid., 303.

[3] Ibid., 290.

[4] Brian McAllister Linn, The Philippine War, 1899-1902 (Lawrence, KS: University Press of Kansas, 2000), 3.

[5] Tim Russert, “Powell’s Doctrine, in Powell’s Words,” The Washington Post, October 7, 2001, http://www.mbc.edu/faculty/gbowen/Powell.htm (accessed September 11, 2012).

Unexpected Consequences: Revolution

Prior to the twentieth century, war was most often the product of the elite rather than the common man. Assuredly, war had an impact, both direct and indirect, on the laborer. Whether from conscription, taxation, or proximity to the combat and the combatants, war could wreak havoc. War could also quickly change boundaries and cause forced changes in allegiance. Entire regions could become disputed territory as powerful states weakened and weaker states grew strong. The chaos of the French Revolution and the Napoleonic Wars led the rulers of Europe to seek a balance of power that would prevent the outbreak of widespread war. For approximately a century they succeeded in quelling the rising nationalistic zeal that threatened to reignite the flames of world war. However, revolutionary ideologies were not contained even as rulers tried to contain revolt. While notions of self-determination, democracy, and equality were discussed by liberal-minded thinkers, the ruling class held fast to the notion that not all men were ready for or capable of self-rule. In some cases, outright racism was the justification for the continuation of imperial dominance and all the ills that imperialism wrought on subjugated peoples. In other cases, benign paternalism justified policies that increased inequality and protected the status quo. Regardless of the grand rhetoric of the time promoting equality and brotherhood, paternalistic elitism, the belief that some were better suited to govern than others, remained the consensus of the day.

As the twentieth century dawned, changes in society due to industrialization were creating unrest. The outbreak of World War I ratcheted up the change. Women went to work in greater numbers, particularly women who belonged to the middle class. Men, who had once been viewed as expendable laborers, became a valuable commodity. Total warfare left no civilian untouched and forced soldiers to confront the futility of war. As the fighting dragged on and deprivation increased, patriotic citizens on the battlefield and the home front struggled to find justification for the continued support of a war that seemed less and less justifiable.

In Russia, the seeds of revolution found fertile ground as the citizens lost faith in an old system that seemed to bring endless suffering. Elsewhere, the notions of liberty, self-determination, and equality caused subjugated peoples to question why they should remain the possessions of rulers in distant lands rather than be allowed to govern themselves. While the Allied nations fought to prevent the invasion, subjugation, and annexation of small nations like Belgium, and to prevent territorial losses in France, the same nations clung fast to their territorial holdings in other regions of the world. The brutality and futility of total war also caused many within Europe to question whether the empires that governed them did so with any consideration for their needs and their security. Ethnic unrest, nationalistic zeal, and distrust of those with different cultural habits increased as the war continued. The seeds of revolution were cast wide, some finding fertile ground immediately and others remaining dormant for decades, but all producing the fruit of conflict and bloodshed. Revolution was not the goal of those who declared war in 1914, but revolution was the unexpected consequence.

National Security: The Value of Nutrition and Education

In the years leading up to World War I, many progressive thinkers began to campaign for social reform. The industrial revolution changed society in many ways, not all of which were good for the nation or for national security. Unskilled and skilled labor alike were susceptible to the ills of urban life. Just as the war in Europe was igniting, one group of progressive reformers was introducing home economics textbooks and coursework into schools. Proper hygiene and good nutrition began to be taught alongside other subjects. Malnutrition and disease were viewed as ills which not only weakened society but undermined national well-being.

The reformers who pushed for better living conditions and education for urban families gained a powerful ally when the United States entered WWI. That ally was the U.S. Army. When faced with a modern war, modern both in weaponry and technology, the U.S. Army quickly discovered that it was no longer beneficial to go to war with illiterate soldiers. Modern war demanded healthy soldiers who could communicate efficiently with each other. Basic health and literacy became a necessity for the modern army. The ground gained in understanding this truth was not easily won. The soldiers who fought in the war learned firsthand the value of both a healthy body and the ability to communicate with their fellow soldiers. Having a common language, coupled with the ability to read and write in it, would be something the returning soldiers would seek for their own children. These veterans would push for change. By the end of World War II, the realities of modern war mandated a nation populated with citizens possessing basic health and education. Education and proper nutrition became a matter of national security.

Additional Reading:

  • Keene, Jennifer D. Doughboys, the Great War, and the Remaking of America. Baltimore: The Johns Hopkins University Press, 2001.
  • National Security Act of 1947, Title X.
  • There were various publications designed to introduce Home Economics in the schools. Some have been scanned and can be found in different e-book collections. Original copies can be found through used bookstores. My favorites were authored by Helen Kinne and Anna M. Cooley.

No Man’s Land

A term older than World War I but popularized during that war, no man’s land refers to a stretch of land under dispute by warring parties; it can also refer to lawless areas with little or no governing control. A buffer zone, on the other hand, is an area which provides a sense of protection from the enemy. When physical fortifications offer little protection, buffer zones can provide a perception of security. Nations great and small seek the perception of security when security is elusive. Treaties and alliances are traditional means of creating a sense of security, as is the creation of buffer zones. During the Cold War, the competing nations sought to expand their spheres of influence, thereby creating buffer zones between themselves and their enemies as their spheres grew. When the Cold War ended and the buffer zones were no longer needed, many of the buffer nations found themselves with fewer friends and fewer resources to prevent lawlessness. These nations found it difficult to avoid the development of no man’s land within their borders.

The United States reasoned, even in its earliest days, that oceans made excellent buffer zones against the conflicts of Europe. Unsettled territories were adequate as buffers, but only to a point. While unsettled territories didn’t pose a direct European threat, they were still loosely under the influence of powerful countries. Additionally, they often attracted outlaws fleeing justice and smugglers seeking a base of operation near their markets. In 1818, Andrew Jackson decided to pursue a group of raiders into Florida. The problem was that Florida was owned by Spain, and Spain had little ability to prevent lawlessness in the territory. When Jackson’s army crossed into Florida, he invaded a foreign nation. Without the consent of Spain, such an action created an international incident. Fortunately, Secretary of State John Q. Adams was able to capitalize on Jackson’s actions and convinced Spain that a treaty was better than a war. His reasoning for defending Jackson’s violation of Spanish sovereignty was that “it is better to err on the side of vigor.”[1] It was certainly not the first time a nation chose a declaration of strength as its response to an international crisis of its own making, but it was possibly the first time such a response became national policy. As Secretary of State, Adams greatly influenced the foreign policy decisions of the president and authored much of what President Monroe presented to Congress. In March 1818, President Monroe declared to Congress that when a nation no longer governed in such a way as to prevent its lawlessness from spilling onto its neighbors, the neighbors had the right to protect themselves and to seek justice, even if that meant violating the sovereignty of another nation.[2] In other words, when an area became no man’s land, it was to the benefit of all nations for the lawlessness to be eliminated by whoever had the strength and will to do so.

Eliminating no man’s land in North America was a task that occupied the United States for more than a century. Eventually, the United States would reach from ocean to ocean and would gain the military might of a great nation. However, even as the twentieth century dawned, the United States struggled to bring law to all of its territory. During the century of expansion, some in the United States saw potential in the acquisition of territory to the south, particularly in Central America. Others recognized the difficulty of governing such a vast nation. Faced with lawlessness due to revolt in Mexico during World War I, President Wilson authorized the U.S. Army’s invasion of Mexico. However, Wilson recognized the value of having a buffer zone south of the border and eventually withdrew the army. In order to ensure that the southern nations created a friendly buffer zone, the United States supported governments that kept the peace, even though keeping the peace came at the expense of basic human rights. Like many leaders before and since, President Wilson put aside ideology and accepted peace-by-force as better than lawlessness.

Reflecting on history, some leaders have sought security by building huge empires, some by establishing buffer zones, and others by the targeted elimination of no man’s land. Regardless of the method men and nations have chosen, it is clear that international law, notions of liberty and self-determination, and hope for world peace are always secondary to the goal of eliminating the threat posed by no man’s land.

Endnotes

[1] Samuel Flagg Bemis, John Quincy Adams and the Foundations of American Foreign Policy (New York: Alfred A. Knopf, 1956), 315-316.

[2] James Monroe, “Spain and the Seminole Indians,” American Memory, Library of Congress (March 25, 1818), http://memory.loc.gov/cgi-bin/ampage?collId=llsp&fileName=004/llsp004.db&Page=183.