When Chaos Threatens, Diplomacy Struggles

Chaos breeds fear much like an insidious virus; everyone becomes fearful that they will be next to catch it. Segregation is then seen as a positive means of prevention, a measure taken before eradication can commence. Calls for calm and cooperation are drowned out by vitriolic shouts for action. It seems that when chaos threatens human cooperation, tact and finesse are the first casualties. Within the world of international cooperation, chaos creates a force against which diplomacy struggles to survive. By the time World War II erupted, chaos had already taken a terrible toll on humanity. Devastating war, multiple pandemics, and a severe economic depression all contributed to a general fatigue which left many seeking strong leadership rather than diplomatic dialogue. The rise of authoritarian leadership should not have surprised many, nor should there have been surprise that some desired isolation. As with the insidious virus, many felt that segregation from the problem was the logical solution. Others placed their faith in military strength and vitriolic rhetoric. World War II demonstrated that neither segregation nor authoritarian leadership would stop chaos. A terrible truth became evident: the world was too interconnected to ever truly support isolationist policies or to prevent the chaos which can derive from authoritarian regimes. However, even as the interconnectedness of the world became an undisputed fact and the vital role of international diplomacy became apparent to those who had once questioned its value, the chaos of the post-WWII world threatened the very cooperation that had brought the war to an end.

World War II had ceased but the suffering caused by war had not. Additionally, the process of decolonization was creating renewed competition for areas of the world which had previously been controlled by foreign powers. A post-colonial world was ripe for chaos, particularly political chaos. The great powers of the day did not wish to see the return of any form of chaos, particularly chaos located in their own backyard. While the Cold War has been characterized as a war between ideologies, it can also be viewed as a war to eradicate regional chaos. The United States and the Soviet Union both developed international policies which were authoritarian in nature. The nations of the world felt distinct pressure to choose a side. Traditional diplomacy suffered even as the United Nations worked to promote peace through diplomatic means. At the end of the day, pressure in the forms of military posturing and economic support or sanction often dictated international relations more than traditional diplomacy did. For nearly fifty years, the United States and the Soviet Union managed to keep the chaos from spreading within their own borders. As with a virus, small outbreaks were to be expected, but the big pandemic was avoided. If chaos was a virus, then the Cold War cure was death to the host whenever segregation proved ineffective. Diplomacy might seem a slow and imperfect treatment for the conflicts that threaten to unleash chaos, but is there truly wisdom in containing chaos through the threat or creation of greater chaos? Some will argue yes while others shudder no, but both should agree that when chaos threatens, diplomacy struggles.

 

Further Reading:

Herring, George C. From Colony to Superpower: U.S. Foreign Relations Since 1776. New York: Oxford University Press, 2008.

Lind, Michael. The American Way of Strategy: U.S. Foreign Policy and the American Way of Life. New York: Oxford University Press, 2006.

Weigley, Russell F. The American Way of War: A History of United States Military Strategy and Policy. New York: Macmillan Publishing Company, 1973.

Zubok, Vladislav M. A Failed Empire: The Soviet Union in the Cold War from Stalin to Gorbachev. Chapel Hill, NC: The University of North Carolina Press, 2009.

American Way of Life and Education during the Cold War

Society is locked in a battle of interpretations when it comes to the Cold War. Was it a war against the sinister spread of communism that threatened the moral fiber and the political existence of the United States, or was it a battle between two economic powers determined to gain world hegemony? Even among historians, the debate rages. Regardless of the underlying goals that fueled the Cold War, one thing remains clear – it was a war both the United States and the Soviet Union were committed to winning. Part of the strategy employed by both powers was the use of education as a means of instilling a common ideology. While the United States would point fingers at the Soviet Union and accuse it of indoctrination rather than education, a real effort to promote an American Way of Life was embarked upon at home. It was also exported in much the same manner as the Soviet exportation of communism.

Unlike with communism, the United States did not have a concise definition that it could promote, but during the decades of the Cold War, an ideology emerged even though it was never encapsulated in one definitive form. Movies and television idolized an American Way of Life that often romanticized an ideal version of the United States and its history. Books were written promoting a celebrated notion of Americanism: some warning about the pervasive threats against the United States, and others attempting to define what was un-American and what was not. The American image was molded and promoted at home and abroad.

World War I had highlighted a need for a more educated populace, but the post-World War II era took education into a new realm as the U.S. educational system underwent a transformation during the Cold War. The study of science and technology increased, and universities often found seemingly endless government funding for research and development, particularly in areas argued to be essential for national defense. While higher education benefited from an influx of funds, it was not just the research labs which saw change. In public elementary and secondary schools nationwide, the youth learned civics lessons even as they learned to Duck and Cover. However, what may have been the most dramatic change came in the form of racial integration. For a nation proclaiming a dedication to equality and promoting democracy worldwide, segregation, especially the segregation of school children, was a political nightmare. The Supreme Court and the State Department worried that segregation jeopardized national interests and foreign policy. A nation determined to promote and export an American Way of Life needed to eradicate segregation from its narrative, and Brown v. Board of Education was key to changing that narrative. The United States hoped to put to rest international criticism against a way of life which had supported segregation. A national policy of desegregation, accompanied by film images of the forced desegregation of elementary schools, went far in achieving that goal. In an ideological battle between superpowers, perception was a vital component of strategy. A change in national policy, particularly with regard to education, helped improve the perception that the principle of equality was fundamental to an American Way of Life.

 

 

Further Reading

Dudziak, Mary L. “Brown as a Cold War Case.” The Journal of American History 91, no. 1 (2004): 32–42.

Dudziak, Mary L. “Desegregation as a Cold War Imperative.” Stanford Law Review 41, no. 1 (1988): 61–120.

Isaac, Joel. “The Human Sciences in Cold War America.” The Historical Journal 50, no. 3 (2007): 725–746.

Lind, Michael. The American Way of Strategy: U.S. Foreign Policy and the American Way of Life. New York: Oxford University Press, 2006.

Merelman, Richard M. “Symbols as Substance in National Civics Standards.” PS: Political Science and Politics 29, no. 1 (1996): 53–57.

Reuben, Julie A. “Beyond Politics: Community Civics and the Redefinition of Citizenship in the Progressive Era.” History of Education Quarterly 37, no. 4 (1997): 399–420.

Solomon, Eric. “Cold War U.” American Literary History 11, no. 4 (1999): 721–735.


Home Production and National Defense

One hundred years ago, malnutrition was a problem that worried a nation facing war. Industrialization and urban growth had moved large populations into congested cities and away from rural communities. Both World War I and World War II would see an increase in the urbanization of the United States. The progressive reformers of the early twentieth century recognized that urbanization was leading to an unhealthy population and pushed for reform. They also pushed for vocational education, particularly in the area of what would become known as Home Economics.

One of the great misconceptions of the modern age is that the skills of the preindustrial age were easily passed from generation to generation, and that it is only modern society that struggles with the problems associated with the loss of these skills. Unlike the dissemination of information, knowledge is gained through practice. Skilled crafts and vocations require practice and often a good deal of instruction by a skilled guide. Remove proper training, and the skills are not learned and society struggles. In particular, modern society struggles with issues of malnutrition and, more recently, obesity, both of which can be directly linked to a lack of basic knowledge of nutritional food consumption. It could also be argued that the conveniences of modern food production contribute to the problems, especially when the issue of ‘prepared’ foods is under discussion. Despite the flood of DIY programs and videos demonstrating cooking and gardening techniques, home production and preparation of food is not as common as needed for a healthy society.

New technology in the early 1900s brought advancements in home food production and storage, but the skills needed to safely process food had to be learned. During World War I, home canning and food storage were demonstrated and encouraged by divisions of local government and subsidized by the U.S. Department of Agriculture.[1] The Smith-Lever Act and the Smith-Hughes Act both provided funding for increased training in food production and domestic skills.

According to historian Ruth Schwartz Cowan, the “decade between the end of World War I and the beginning of the depression witnessed the most drastic changes in patterns of household work.”[2] Industrialization was changing the way work was managed, not just in the factories, but also in the homes. Industrialization increased the availability of commodities, many of which made household work less time-consuming and arduous. Convenience is usually an appreciated commodity, especially by those tasked with managing a household while feeling the pressures of working outside the home. However, the skills that had been learned before convenient options became available were not always passed down to the next generation. Much like the youth of today, the youth of past generations seldom liked learning to do things the old-fashioned way, especially not when new technology and innovation were changing the world. In order to offset the trend and ensure a healthier society, young women in private and public schools were taught the skills that many today assume would have been handed down from mother to daughter. Books titled Clothing and Health, Shelter and Clothing, Foods and Household Management, and Household Arts for Home and School were produced and marketed to U.S. high schools. In the words of one author, “The authors feel that household arts in high schools should not be confined to problems in cooking and sewing. They are only a part of the study of home making.” In the 1915 edition of Shelter and Clothing, an entire chapter is dedicated to “the water supply and disposal of waste,” and includes diagrams of the modern flushable toilet. Technology had changed the lives of many, but progressive minds of the age could see that new technology had to be integrated into society through education rather than simply leaving society to work through the changes without assistance. World War I, the Great Depression, and World War II jolted policy makers into action. By the mid-1950s, Home Economics, as a high school subject, was accepted as an integral part of keeping the nation healthy and ready for future war. Even as warfare became more mechanized, the nation still held on to a belief that a healthy society was a strong society, and many school systems encouraged both male and female participation in Home Economics during the early 1980s. Unfortunately, the Technological Revolution of the 1990s and 2000s shifted the mindset of many, and like the industrial revolutions of the past, this latest revolution has elevated convenience over skill. While information is just a click away, the knowledge that comes from skilled instruction is often harder to obtain, placing the nation at risk once more.

 

 

Endnotes

[1] Emily Newell Blair and United States Council of National Defense, The Woman’s Committee: United States Council of National Defense, An Interpretative Report, April 21, 1917, to February 27, 1919, e-book (U.S. Government Printing Office, 1920).

[2] Ruth Schwartz Cowan, “The ‘Industrial Revolution’ in the Home: Household Technology and Social Change in the 20th Century,” Technology and Culture 17, no. 1 (1976): 1–23.

 

Not Naive

In January 1789, on the eve of his election to the presidency, George Washington wrote the following words to his dear friend, the Marquis de Lafayette.

While you are quarreling among yourselves in Europe – while one King is running mad – and others acting as if they were already so, by cutting the throats of the subjects of their neighbours; I think you need not doubt, My Dear Marquis we shall continue in tranquility here – And that population will be progressive so long as there shall continue to be so many easy means for obtaining a subsistence, and so ample a field for the exertion of talents and industry.

Washington, like so many of his countrymen, saw the American abundance of land and resources as a way to ensure the avoidance of foreign chaos, specifically the chaos that derives from overcrowding and the ills such chaos inspires. He wrote, “I see a path, as clear and as direct as a ray of light…Nothing but harmony, honesty, industry, and frugality are necessary to make us a great and happy people.”[1]

Men like Washington felt strongly that certain key moral principles would flourish in a land as abundantly blessed as America. As a leader of men for most of his adult life, he would not have been blind to the tendencies of human nature, but clearly he believed that those men dedicated to “industry and frugality” would prevail over those who sought slothful pursuits. The United States was predominantly agrarian during those early years. Commerce, especially the trade of raw materials for finished goods, may have dominated the seaboard areas of the new nation, but industrialization had not yet lured workers from the fields and into cities. Subsistence farming was still both the predominant occupation and an occupation that did not tolerate slothful pursuits. Washington was able to envision generations of “tranquility” rather than the chaos that derived from congested cities and limited resources. However, he was not naive about the realities of human nature; he simply could not foresee how quickly the world would change once industrialization took hold.

[1] George Washington, George Washington: Writings, vol. 91, Library of America (New York: Library of America, 1997), 717–718.

Rumors and Rhetoric

In 1783, at the army camp located in Newburgh, New York, rumors of revolt were quelled when General George Washington addressed his men. The rhetoric, which had grown from frustration with Congress over back pay, was effectively countered when Washington spoke: “…let me entreat you, Gentlemen, on your part, not to take any measures, which, viewed in the calm light of reason, will lessen the dignity, and sully the glory you have hitherto maintained…”[1] Scholars have argued over whether the crisis in Newburgh was one of rhetoric only, or whether an actual conspiracy existed which threatened the stability and future of the United States.[2] Regardless, the Newburgh Affair highlights how political rhetoric can lead to crisis, and how calm leadership rather than dramatic action can be the solution.

Conspiracy theorists and politically motivated historians have inferred that orchestrated nationalist machinations were the cause of the rumors and implied threats that swirled around Newburgh in the fall and winter of 1782-83. Others argue that frustration at the lack of pay, and the worry of a post-conflict future, organically inspired the rhetoric Washington felt needed to be addressed on March 15, 1783. Pamphlets, newspapers, public meetings, and personal correspondence were the main vehicles for the spreading of news and the airing of grievances prior to the technological age. The years leading up to the outbreak of war proved that these were effective tools in rousing public opinion in order to force change. It stood to reason, then, that these same tools would be used when Congress ground to a standstill on the issue of military pay and veteran benefits.

Even in the days before technology transformed the ways in which the world communicated, rumors once started were difficult to suppress. Inflamed rhetoric was even harder to manage, for often it was printed and preserved for posterity. Fortunately for the young republic, General Washington was a man who had learned that brash language and rash actions were counter-productive to stability and prosperity. While he understood the frustration[3] of his men, he also understood that a liberty so newly achieved could not withstand civil discord.[4] A nation built from the fire of revolution would have to learn how to handle and even embrace civil discord; however, Washington was wise in objecting to discord created by “insidious design” and spread by rumor and extreme rhetoric.

 

Endnotes

[1] George Washington, George Washington: Writings, vol. 91, Library of America (New York: Library of America, 1997), 499.

[2] C. Edward Skeen and Richard H. Kohn, “The Newburgh Conspiracy Reconsidered,” The William and Mary Quarterly 31, no. 2 (1974): 273–298.

[3] Mary Stockwell, “Newburgh Address,” George Washington’s Mount Vernon, http://www.mountvernon.org/research-collections/digital-encyclopedia/article/newburgh-address/.

[4] Washington, 500.

Military Superiority Leads to Decline

As the twentieth century ended and the specter of the Cold War appeared to be fading into history, political scientists pondered the question of how a new world order would take shape under the direction of a victorious superpower. As John Ikenberry stated, victors try “to find ways to set limits on their powers and make it acceptable to other states.”[1] The United States, having spent a century building its image as a military power determined to protect the world from evil and, in doing so, spread democracy, found itself in a dilemma. While talking heads and braggarts proclaimed U.S. superpower greatness, diplomats faced the harsh reality that yesterday’s protector can quickly become today’s bully and tomorrow’s enemy. Additionally, the economic strain military spending places on a society can become politically detrimental once victory occurs. In the past it was said that to the victor go the spoils, but in modern times, with plundering being frowned upon, the victor tends to win a headache both at home and abroad without seeing any real benefit. Without a change in policy, particularly policy pertaining to its military superiority and status, a victorious nation discovers that military superiority can lead to economic and political decline.

Of the many headaches the United States developed as the lone superpower in the years following the end of the Cold War, probably the most contentious was being asked to intervene in conflicts great and small. Seldom was there a clear right side and wrong side to support. In many cases the crises that prompted the debate over intervention occurred in regions that had previously been under the political, economic, and military supervision of the Soviet Union. Even when acting under the umbrella of the United Nations, U.S. intervention could stir conflicting emotions in the crisis region. The United States had been both the enemy and the possessor of enviable commodities for fifty years. Envy and distrust were not feelings easily eradicated simply because the war was over. In a world that seemed to be rupturing in the absence of Cold War superpower dominance, the United States struggled with its expanded role of policeman, banker, and social worker.

Military dominance, which had spurred the U.S. economy in the years following World War II, became a burden following the end of the Cold War. In the wake of international cooperation and the perception of peace, nations could shift away from military technology as a basis of economic growth. Nations which remained entrenched in military development became economically dependent on wars that seldom required Cold War technology. Furthermore, Cold War technology had been all about fighting a war from a distance, while the conflicts of the twenty-first century required boots on the ground. When President Truman and President Eisenhower put their support behind the development of nuclear technology, and behind the technology to deliver nuclear weapons from a distance, part of their justification was that it would save U.S. casualties and, hypothetically, shorten if not prevent war. Their reasoning was based predominantly on the notion that nations would fight nations, and that the days of tribal warfare were becoming part of the past. When the theories and perception of modern war shifted after the attacks on the United States in 2001, the world powers seemed taken by surprise. When the Second Gulf War did not produce the results predicted, when peace did not flourish, and when terrorism spread rather than diminished, the United States seemed not only surprised but confused. The U.S. war strategy and military development, so honed during the twentieth century, did not work in the twenty-first. A nation which had grown powerful through military superiority found itself the targeted enemy rather than the celebrated hero. Furthermore, it found itself struggling to justify increasing national debt, made larger due to wars that seemed to have no end. Like many great powers which had come before, the United States faced decline despite continued military superiority. In fact, it could be argued, the United States faced decline because of its military superiority.

 

 

Endnotes

[1] G. John Ikenberry, After Victory: Institutions, Strategic Restraint, and the Rebuilding of Order after Major Wars (Princeton: Princeton University Press, 2001), xi.

 

Further Reading

Hixson, Walter L. The Myth of American Diplomacy: National Identity and U.S. Foreign Policy. New Haven, CT: Yale University Press, 2008.

Kennedy, Paul. The Rise and Fall of the Great Powers. New York: Vintage Books, 1989.

Strong Military as a Path to Prosperity

There is a belief held by many that a strong nation can ensure stability and promote prosperity by developing a strong military presence in a region. It is not a new theory, nor is it difficult to validate when history is full of examples of empires formed by military strength that then added to their own prosperity by quelling regional conflict and instability. In fact, it is much easier to cite examples of empires made strong by force than by diplomacy; therefore, it should be no surprise that the United States followed a similar path as it sought to expand its economic interests during the late nineteenth and early twentieth centuries.

What might be surprising, especially given that the United States went on to flex its military might for the greater part of the twentieth century, is that there had been fierce opposition within the United States to the notion of militarizing, taking on the role of stabilizer and protector, and pursuing the status of empire.[1] Even during the years following the Monroe Doctrine, many argued that the United States needed to simply concentrate on the lands of North America and leave the affairs of Europe to the Europeans. However, these well-intended notions of independence and isolation failed to take into consideration that sea trade could not be ‘free’ or ‘secure’ unless someone policed the waters. The United States was comfortable leaving that job to the British Navy even though the British posed the greatest threat to U.S. interests at the time. However, by the end of the nineteenth century more U.S. voices were calling for a change. One of these voices was that of Alfred Thayer Mahan, who wrote, “All men seek gain and, more or less, love money; but the way in which gain is sought will have a marked effect upon the commercial fortunes and the history of the people inhabiting a country.”[2] He argued that for economic gain to increase, sea trade must be protected, and rather than relying on the naval strength of others, the United States must get into the game and become a naval power. A few short years after he made his argument, the United States acquired territories and increased its markets overseas. A larger navy was required.

When faced with questions and criticism concerning the appearance of imperial objectives, President Theodore Roosevelt responded, “When the Constitution was adopted, at the end of the eighteenth century, no human wisdom could foretell the sweeping changes, alike in industrial and political conditions, which were to take place by the beginning of the twentieth century.”[3] A few years later he would assure the critics, “All that this country desires is to see the neighboring countries stable, orderly, and prosperous.”[4] Whether Roosevelt was genuine in his assurances or whether he was fully aware that the nation was heading down an imperial path is debatable, but one thing has been clear from that point forward: the United States was no longer a regional power in theory only but had become one in reality. During the next two decades, the United States would transition from regional power to world power, and the transition would happen through the use of military might.

 

[1] Many will argue that the United States never pursued or achieved the status of empire. They will claim that the United States assimilated and incorporated territories rather than acquired colonies and that the peoples of the territories were treated as citizens rather than as subjugated peoples. The debate on the question of whether the United States is or was an empire can be quite interesting to follow.

[2] Alfred Thayer Mahan, The Influence of Sea Power Upon History, 1660–1783 (1890), Kindle.

[3] Theodore Roosevelt, “First Annual Message,” Presidential Speech Archive, Miller Center, University of Virginia, (December 3, 1901), http://millercenter.org/president/speeches/detail/3773.

[4] Theodore Roosevelt, “Fourth Annual Message,” Presidential Speech Archive, Miller Center, University of Virginia (December 6, 1904), http://millercenter.org/president/speeches/detail/3776.

Obligated to Intervene

In 1820, the Congress of Troppau was convened. The great powers of the day determined that they held the right to intervene in the revolutionary conflicts of neighboring states. Maintaining the status quo and preventing the spread of nationalism and revolution was viewed as vital in the quest to quell the type of conflict that had erupted in Europe during the French Revolution and the Napoleonic Era. While the beginning of the century had been fraught with what some called the first worldwide war, the remainder of the century saw only regional conflicts, most of which were harshly quelled before they could spread outside their borders. However, the policy of intervention did not quell nationalism. During the twentieth century nationalism would be at the heart of many conflicts, and the notion that great nations had the right to intervene to protect the status quo would be at the center of international policy for many nations, including the United States.

In the case of the United States, intervention became a tool to either protect or disrupt the status quo in a region, depending on which was most beneficial to the interests of the United States. Intervention often placed the nation at odds with its own revolutionary history and patriotic rhetoric. Despite seeming hypocritical in nature, the United States was not forging new diplomatic patterns but rather following the patterns established by the great powers of the past. The U.S. Founding Fathers may have wanted to distance themselves from the politics and practices of Europe, but their descendants embraced those policies as the United States rose to international supremacy during the twentieth century.

During the rise to superpower status, the United States benefited economically and politically. The right to intervene allowed the United States to protect economic markets, and in some cases add new markets and resources to its growing stockpile. While the nation doggedly denied that it was an empire, by the end of the twentieth century the problems associated with empires began to plague the nation. Most prominently, it could be argued, the United States faced the growing international expectation that it would intervene when conflict threatened a region’s status quo. After a century of gaining prominence and wealth through international intervention, often with the sole goal of protecting resources and markets, the United States found that the right to intervene had transformed into an obligation to intervene.

Historiography: How the Story is Written

In the modern world of minute-by-minute news coverage, it is easy to assume that history is being recorded both comprehensively and accurately. One may even think that the role of the historian is passé and that all that is needed for the modern world is the analyst who will try to make sense out of current events. Even in a world where the events of the day are documented, and where social media can turn our most mundane activities into a historical sketch that we can share with all of our cyber friends, the role of the historian is crucial. It may even be more crucial than ever before because of the sheer volume of data that must now be sifted through in order to create a comprehensive, yet pertinent, story.

Accuracy in historical record has always been important to historians, but it has not been nearly as important as the story. In the days in which history was borrowed from others in order to bolster a rising nation’s image, accuracy was often less important than fostering the image that a new empire was ancient and eternal in origin. A good example of this is found with the Roman Empire, which, having risen in power, desired a historical record that would magnify its greatness rather than highlight its youth. Throughout history, political entities as well as powerful individuals have sought to bolster their images by creating histories that connect them to other prestigious historical events, periods, and reigns. By likening themselves to others who were dynamic, successful, dominant, and strong, they create an image of grandeur that is only partially based on the events of their own time and of their own making.

As technology and the availability of the written record evolved over the centuries, it became harder to use the history of others as a means by which one’s own history could be created. Even before the printing press, some historians began comparing their own region’s historical journey with that of their neighbors. In some cases, as with the historian Tacitus, the neighbor was heralded for its purity and simplicity in contrast to the corruption at home. In other cases, the neighbor was villainized in an attempt to deflect attention away from unpopular home-state policy. In either situation, the history of others was borrowed, no longer as a means to explain where a political state had come from, but rather to explain how the home-state compared to others. This trend created an interesting phenomenon in the writing of history. No longer was it simply good enough to extol the greatness of one’s own past; now it was acceptable, and even expected, to criticize the neighbor as a means of exalting oneself. By making the neighbor seem less noble or even villainous, the historian could create an even more illustrious history of the home-state.

In the not so distant past, historians were at the whim and will of powerful men who could finance the historical pursuits of the scholar. Modern technology has changed this to some extent. Scholarly history may still be contingent on research grants made possible by powerful institutions and individuals, but technology has made everyone who uses the internet a historian, or at least a historical participant. No longer is it only the famous, powerful, or well-connected who get recorded. Some individuals may only be contributors of data, whereas others may add more significantly to the record of daily events. In this world of high-speed technology and vast data collection, history is being recorded more thoroughly than ever before, but that does not mean the record is any more accurate. Often, history is being recorded in very inaccurate ways and by people with little to no understanding of the ramifications this has for both the people of today and the historians of tomorrow. In the race to beat the ‘other guys’ to the best story, accuracy, once again, is secondary to the story being told.

Modern historians, bound by ethical parameters of historical accuracy, try to share a story that is comprehensive and as unbiased as possible. They are taught to question sources and present a full picture of an event, a person, or a period of time. In some cases, they are even taught to use good writing skills in order to make the story enjoyable to read. They are taught to recognize that history is not always pleasant, but that it can always be of value, even if only to a few. At times, history can be a story of valor, bravery, and patriotic glory. At other times, history can be just the opposite. The modern historian may write a tale that makes some readers uncomfortable, but the job of the historian is to write a comprehensive and pertinent story rather than the myths and propaganda so many others are determined to write.

Liberty: A Cost of War

During war, even a war fought in far-flung lands, the civilian public is not guaranteed the comforts of peacetime. Rationing of food and clothing can be expected as a nation directs its energy and material goods toward the war effort. Additionally, one can expect taxation to increase as the nation’s war debt mounts. However, when one’s liberty becomes a cost of war, the nation faces a crisis that is much more difficult to overcome with patriotic slogans. Fear, spread through propaganda campaigns and doom-inspiring rhetoric, becomes the tool that convinces a nation that the loss of constitutionally protected liberty is a price worth paying for the ultimate goal of winning the war.

In the mid-to-late 1700s, the cost of war was felt most keenly in the form of taxation. Colonial Americans were opposed to the new taxes despite the fact that the taxes helped pay for the military support the colonists benefited from each time a frontier war erupted. Their argument, in simple terms, was that if they were to be taxed like regular English subjects, then they should have all the rights and privileges afforded to regular English subjects. In particular, they should have the right to political representation. When their demands for equality were not heeded, the colonists decided that rebellion was the solution. War weariness and the costs of war played a large role in the final outcome. Endless war was not a good national policy, and even the powerful British Empire had a difficult time arguing against that truth.

During the American Revolution, the colonists who supported rebellion and sought independence were willing to sacrifice personal comfort for their cause, but that dedication was challenged when the new nation found itself sacrificing economic prosperity under the Embargo Act of 1807. In an ill-conceived attempt to force France and Great Britain into treating the United States with greater respect, President Thomas Jefferson and Congress passed an embargo that resulted in great hardship for New England merchants. Fortunately, the War of 1812 concluded just as the anger in New England was reaching a boiling point, and President James Madison was not faced with the daunting task of suppressing a homeland rebellion.

When homeland rebellion did finally erupt years later, as the national argument concerning the issue of slavery boiled over, President Abraham Lincoln did not hesitate to suspend certain constitutionally guaranteed rights in an effort to settle the conflict more quickly. His justification was that those who were trying to separate from the Union, and those who were a direct threat to the Union, were not necessarily protected by the Constitution. He was not alone in his evaluation that during war certain liberties might need to be curtailed. The remnants of Congress agreed and passed the Habeas Corpus Suspension Act of 1863.

Economic hardship and the forfeiture of liberty seemed justifiable when the nation was at war, especially if the forfeiture of liberty was directed at those who seemed set on disrupting the nation’s ability to fight the war. It should not come as a surprise that when the nation went to war after the bombing of Pearl Harbor, those who seemed too closely tied to the enemy would find themselves stripped of their constitutionally protected liberty. It mattered little that their ties were familial in nature as opposed to political. The nation had to be protected in order for the United States to prevail. In the end, the war lasted only a few short years. The rights and liberty of the interned were restored, everyone went on their merry way, and the nation flourished as it helped rebuild the free world. Or so the propagandists proclaimed.

Yet another enemy lurked and another war loomed. Constitutionally protected rights were no longer sacred in the face of an enemy. A nation at war, even a cold one, had to protect itself from enemy sympathizers and subversives. If this meant spying on its own citizens, then that is what the nation would do. When the truth of this violation became publicly known after the burglary at the FBI office in Media, Pennsylvania in 1971, Congress acted to halt such a travesty, but it was questionable even at the time whether the actions of Congress would hold up during the ongoing Cold War.

War, it seemed, would always be a justification for a temporary loss of freedom and liberty, but as the twentieth century ended and the twenty-first century began, war shifted away from the traditional conflicts that often erupted between two political enemies. Instead, war became a conflict with phantoms and ideologies. First there was the War on Drugs and then the War on Terror, both eroding the protections guaranteed in the Constitution, and both without any end in sight. The cost of these wars continues to be great, and it seems that rather than causing economic hardship and the sacrifice of personal comfort, these wars demand a greater price – liberty.