
Looking Forward – Learning from the Past

Seven decades ago, the United States emerged from World War II relatively unscathed compared to the other great nations of the world. It found itself in a position to help rebuild, and in doing so it prospered. This prosperity was evident in its purchasing power. A look through the cupboards and attics of our aging population will unearth the evidence of that purchasing power: crystal and silver tea services, porcelain and fine china, flatware of the highest quality, and linens too lovely ever to have been used except on the most special occasions. These imported items, often the gifts associated with marriage and new life, were made in recovery zones, and they helped reestablish the war-torn markets and industries vital to the lives of those fortunate enough to have survived a horrific war. These items also confirmed to the U.S. populace that the nation had fully become a great power, much like the empires that had dominated the century before.

The purchasing power and abundance of the post-war era in the United States also provided a balm for the hardship from which so many had suffered. The world had been either at war or mired in financial depression for over three decades when WWII ended; an entire generation felt the burden of despair lift when the industrial and economic potential of the United States was realized after the war. The youth, those too young to have felt the full brunt of hardship, reached adulthood in the glow of economic world domination. This glow was only slightly dimmed by the threat of nuclear war, a threat that increased as they aged but did little to blunt their earning power. War machines equaled economic growth as long as the nation continued to view such development as vital. Once that view shifted, however, the realities of overextension and taxation created an ever-growing sense of waste and loss. The greatness of their youth seemed to have slipped away, and in its place only a sense of uncertainty remained. The Cold War, with all its ills, had provided secure jobs and a sense of proactive security. When it ended, a new generation faced the aftermath of war. For them, the balm came in the form of a technology boom, rapidly falling interest rates, and open borders; these changes provided the American Dream the youth had heard about but had worried would be beyond their reach.

As the twenty-first century dawned, rumblings of change and challenge emerged: first with the Y2K fears and then with the market crash following the September 11 terror attacks. A nation which had for so many years found economic stability in military development and distant wars once again turned to war as a means to unify and solidify a shaken populace. However, unlike during the Cold War, the United States had lost its standing as the sole superpower. Politically, economically, and militarily, others had risen from the ashes and emerged as powerful equals. No longer was the United States seen as the great protector; rather, many saw the United States as a threat to peace. Others questioned whether the political system, which had weathered over two hundred years of challenge, would survive the challenges of the new century. Unlike in recent history (the last three hundred years or so), the new century saw a return of conflict dominated by non-state actors, which created a longing for the seemingly stable nature of the Cold War, stable despite its harsh suppression of ethnic conflict, damaging political interference, and costly proxy wars.

This longing for the stability and prosperity of the Cold War provided fuel for the fear, anger, and desperate hope which motivated many as they voted yesterday. The new century has not secured the American Dream for the younger generation; rather, it seems only to have jeopardized it for the older generation. Conservative or liberal, the policies formed in national and state capitals seem, at best, to be bandages rather than sutures. Few anticipated a speedy recovery, but many are willing to risk experimental treatment in the hope of a miracle cure. The nation should survive this latest illness, and the treatment it has chosen; however, it is unlikely that the youth, the youngest voters, will find the balm their parents and grandparents found in an economic boom. Industry, and even much of technology, has gone elsewhere. The borders of nations are closing rather than opening. Peace is threatened as much by the turmoil within as by the turmoil without, and the economy is adversely affected by all the uncertainty. The generations who have suffered the ills and recoveries of the past may be too fatigued to calm the fears and fevers of today’s youth. There simply may be no balm.

History oftentimes seems to be about groups of people working against or for an issue. After destructive wars, terrible depressions, or horrific epidemics, people tend to work together to bring about recovery, with special concern for the young, who are always the true hope for a better future. At this time, when the ills that face the world are less tangible but no less threatening, it is vital, as we look to history for the lessons taught by groups of people in the past, that we remember the work always began and ended with the individual: the individual who created the cure, who did the work, and who did not lose hope. Never did they wash their hands and walk away from the crisis or turn their backs on the young; rather, they recognized that the young are, in reality, the key to the stability and prosperity so sought after.

Rumors and Rhetoric

In 1783, at the army camp in Newburgh, New York, rumors of revolt were quelled when General George Washington addressed his men. The rhetoric, which had grown from frustration with Congress over back pay, was effectively countered when Washington spoke: “…let me entreat you, Gentlemen, on your part, not to take any measures, which, viewed in the calm light of reason, will lessen the dignity, and sully the glory you have hitherto maintained…”[1] Scholars have argued over whether the crisis in Newburgh was one of rhetoric only, or whether an actual conspiracy existed which threatened the stability and future of the United States.[2] Regardless, the Newburgh Affair highlights how political rhetoric can lead to crisis, and how calm leadership rather than dramatic action can be the solution.

Conspiracy theorists and politically motivated historians have inferred that orchestrated nationalist machinations were the cause of the rumors and implied threats that swirled around Newburgh in the fall and winter of 1782-83. Others argue that frustration at the lack of pay, and worry about a post-conflict future, organically inspired the rhetoric Washington felt needed to be addressed on March 15, 1783. Pamphlets, newspapers, public meetings, and personal correspondence were the main vehicles for spreading news and airing grievances prior to the technological age. The years leading up to the outbreak of war had proven that these were effective tools for rousing public opinion in order to force change. It stood to reason, then, that these same tools would be used when Congress ground to a standstill on the issue of military pay and veteran benefits.

Even in the days before technology transformed the ways in which the world communicated, rumors once started were difficult to suppress. Inflamed rhetoric was even harder to manage, for often it was printed and preserved for posterity. Fortunately for the young republic, General Washington was a man who had learned that brash language and rash actions were counter-productive to stability and prosperity. While he understood the frustration[3] of his men, he also understood that a liberty so newly achieved could not withstand civil discord.[4] A nation built from the fire of revolution would have to learn how to handle and even embrace civil discord; however, Washington was wise in objecting to discord created by “insidious design” and spread by rumor and extreme rhetoric.


Endnotes

[1] George Washington, George Washington: Writings, vol. 91, Library of America (New York: Library of America, 1997), 499.

[2] Edward C. Skeen and Richard H. Kohn, “The Newburgh Conspiracy Reconsidered,” The William and Mary Quarterly 31, no. 2 (1974): 273–298.

[3] Mary Stockwell, “Newburgh Address,” George Washington’s Mount Vernon, http://www.mountvernon.org/research-collections/digital-encyclopedia/article/newburgh-address/.

[4] Washington, 500.

Historiography: How the Story is Written

In the modern world of minute-by-minute news coverage, it is easy to assume that history is being recorded both comprehensively and accurately. One may even think that the role of the historian is passé and that all the modern world needs is the analyst who will try to make sense of current events. Even in a world where the events of the day are documented, and where social media can turn our most mundane activities into a historical sketch we can share with all of our cyber friends, the role of the historian is crucial. It may even be more crucial than ever before because of the sheer volume of data that must now be sifted through in order to create a comprehensive, yet pertinent, story.

Accuracy in the historical record has always been important to historians, but it has not been nearly as important as the story. In the days when history was borrowed from others in order to bolster a rising nation’s image, accuracy was often less important than fostering the impression that a new empire was ancient and eternal in origin. A good example of this is found with the Roman Empire, which, having risen to power, desired a historical record that would magnify its greatness rather than highlight its youth. Throughout history, political entities as well as powerful individuals have sought to bolster their images by creating histories that connect them to other prestigious historical events, periods, and reigns. By likening themselves to others who were dynamic, successful, dominant, and strong, they create an image of grandeur that is only partially based on the events of their own time and of their own making.

As technology and the availability of the written record evolved over the centuries, it became harder to use the history of others as a means by which one’s own history could be created. Even before the printing press, some historians began comparing their own region’s historical journey with that of their neighbors. In some cases, as with the historian Tacitus, the neighbor was heralded for its purity and simplicity in contrast to the corruption at home. In other cases, the neighbor was villainized in an attempt to deflect attention away from unpopular home-state policy. In either situation, the history of others was borrowed, no longer as a means to explain where a political state had come from, but rather to explain how the home-state compared to others. This trend created an interesting phenomenon in the writing of history. No longer was it simply good enough to extol the greatness of one’s own past; now it was acceptable, and even expected, to criticize the neighbor as a means of exalting oneself. By making the neighbor seem less noble or even villainous, the historian could create an even more illustrious history of the home-state.

In the not so distant past, historians were at the whim and will of powerful men who could finance the historical pursuits of the scholar. Modern technology has changed this to some extent. Scholarly history may still be contingent on research grants made possible by powerful institutions and individuals, but technology has made everyone who uses the internet a historian, or at least a historical participant. No longer is it only the famous, powerful, or well-connected who get recorded. Some individuals may only be contributors of data, whereas others may add more significantly to the record of daily events. In this world of high-speed technology and vast data collection, history is being recorded more thoroughly than ever before, but that does not mean the record is any more accurate. Often, history is being recorded in very inaccurate ways and by people with little to no understanding of the ramifications this has for both the people of today and the historians of tomorrow. In the race to beat the ‘other guys’ to the best story, accuracy, once again, is secondary to the story being told.

Modern historians, bound by ethical parameters of historical accuracy, try to share a story that is comprehensive and as unbiased as possible. They are taught to question sources and present a full picture of an event, a person, or a period of time. In some cases, they are even taught to use good writing skills in order to make the story enjoyable to read. They are taught to recognize that history is not always pleasant, but that it can always be of value, even if only to a few. At times, history can be a story of valor, bravery, and patriotic glory. At other times, history can be just the opposite. The modern historian may write a tale that makes some readers uncomfortable, but the job of the historian is to write a comprehensive and pertinent story rather than the myths and propaganda so many others are determined to write.

History: More Than Just Cramming for a Test

History is a required subject in schools throughout the United States, but is history simply a subject to be covered, crammed, tested, and forgotten? How much do we really know and understand about our own history? Historian Tony Williams asked, “do we really understand the difference between Jamestown and Plymouth? Or between the Declaration of Rights and the Declaration of Independence?”[1] Do we remember more about our elementary Thanksgiving pageants than we do about the actual people and events that shaped our nation and the world in which we live?

Recently, I saw a meme pop up on the internet that counseled readers not to believe revisionist historians, and implied that they lie in order to strip away the moral fiber of the nation. Clearly, the intent of the statement was to cause distrust of accounts of history that challenge particular points of view, and to breed distrust of academic sources of history as opposed to sensationalized, patriotic versions that tend to leave out the controversial bits. Sadly, too many people avoid academic histories because they distrust the historian’s motivation or because they think scholarly history is boring. Contrary to what many believe, scholarly history is not monolithic in nature, and most historians are not set on convincing the public that celebrated historical characters are all villainous. Rather, academic historians work hard to replace fiction with fact and to separate myth from history. Historian Carol Berkin wrote, “They write about what interests them… [and] firmly reject collective agendas no matter what group suggests them and no matter what pressing problems those agendas might promise to resolve.”[2] The result is that rather than only providing a timeline of the events and peoples of the past, historians have provided greater access to and understanding of real people and of their lives beyond the grand events of their day. Instead of data to be memorized the night before a test and then quickly forgotten, scholarly history provides a journey back in time, introducing the reader to a diverse world that is much more fascinating than might ever have been discovered in the days when cramming for the test was all that seemed to matter.


Endnotes

[1] Tony Williams, America’s Beginnings: The Dramatic Events That Shaped a Nation’s Character (Lanham, MD; Williamsburg, VA: Rowman & Littlefield Publishers, 2010), ix.

[2] Carol Berkin, First Generations: Women in Colonial America (New York: Hill and Wang, 1997), viii.

The Good Old Days

Memory is a tricky thing that tends to filter events by removing the negative aspects from our recollection. When current events are not to our liking, we look to the past and remark on how much better it was in comparison to the present. While it is also true that the positive aspects of an event or period of time can be filtered out, leaving us with only a bleak recollection of the time, it is more often the case with collective memory that we glorify rather than demonize the past. History, the record and the study of that record, helps remove the myth that memory creates.

For many who came to maturity during the 1980s, the decade has come to represent a better time, or in other words, The Good Old Days. The decade is viewed as one in which U.S. power and culture were strong and celebrated. The music and clothing were distinctive and memorable. Soft power was used in conjunction with traditional methods of political power, and the influence of the United States was felt worldwide. The notion that the Cold War was won by forceful rhetoric and the exportation of McDonalds and MTV has resonated with those who now view the 1980s as the glorious decade of U.S. supremacy. While few will argue against the notion that the United States reached a superpower zenith as the twentieth century neared its end, historians will be quick to note that there was more to the decade than glory and power. There was fear – fear of nuclear destruction, fear of the pandemic spread of disease, and fear of ever-increasing drug use in mainstream society. However, in a decade when politicians could harness the media, or at least greatly influence the script, and when social media was yet unborn, it was easy for the general public to hear the strong rhetoric and believe the message. Embedded in the rhetoric was the notion that war was the answer to all the ills that plagued the nation. Whether an ideological war with an evil enemy, a hot war often conducted in secrecy, or a war on drugs that often impinged on civil rights but had a moral justification, war was the solution. War was also the solution to a lagging economy. Investment in the machines of war burdened the nation with debt, but it also put people to work and made a select group wealthy in the process. War and power went hand in hand, and those who viewed power as the ultimate evidence of success sought to encourage and perpetuate the notion that only through the constant demonstration of strength could the fears of a nation be quelled. Decades later, their efforts have caused many to look back in longing for a better time – a time of strength.

Memory is a tricky thing. Few in the public participated directly in the world-changing events of their youth, and fewer still have found a need to crack open the history books to learn more about the period of time in which they lived. Historians seek to delve beyond collective memory and search for the data that reveals a fuller image of the people and events of a period of time. For those who seek to understand the history rather than the myth of the 1980s, The Good Old Days were days of rhetoric and war, a nation recovering from an economic recession, and a time when money equaled political power. So, in a way, those days are not so dissimilar to the present.


Further Reading

Chollet, Derek, and James Goldgeier. America Between the Wars: From 11/9 to 9/11; The Misunderstood Years Between the Fall of the Berlin Wall and the Start of the War on Terror. New York: PublicAffairs, 2008.

Gaddis, John Lewis. We Now Know: Rethinking Cold War History. Oxford: Oxford University Press, 1997.

Leffler, Melvyn P., and Jeffrey W. Legro, eds. In Uncertain Times: American Foreign Policy after the Berlin Wall and 9/11. Ithaca, NY: Cornell University Press, 2011.

Saull, Richard. The Cold War and After: Capitalism, Revolution and Superpower Politics. London: Pluto Press, 2007.


Sanitizing the History of War

The study of history can be a wonderful method of instilling patriotism and civic pride in a nation. During the early years of the Cold War, the study of history was viewed as a vital way to instill the notion that the home nation was virtuous and grand, but opposition to a sanitized version of history was growing even as ultra-patriotism became a propaganda tool. Certainly, the sanitization of the history of war did not begin during the Cold War, but during that half century the sanitized version of history was considered patriotic, and history critical of the homeland was seen by many as subversive. Therefore, the shock was profound when footage of war was televised for all to see during the Vietnam War. A generation reared on stories of noble victories which had defeated tyranny, slavery, totalitarian abuse, and genocide found themselves faced with the horror of war, most for the very first time. Furthermore, war was not the noble endeavor they had been told it was. It was not a clear-cut battle between good and evil. It was ambiguous, uncertain, and many times utterly irrational.

The sanitization of history had stripped from collective memory the realities of war. The brutality, the savagery, the rape, and the hunger: all the devastating human suffering had become overshadowed by glorified patriotism. It became easy to believe that the modern rules of war were long rooted in history and that only a villainous enemy would commit atrocities against prisoners and civilians. In a sanitized history, it was easy to forget the human suffering of the American Revolution and that such suffering was generally accepted as part of war.[1] Schoolchildren had been taught of noble men, of dedicated soldiers who faced frostbite and starvation as they pressed for liberty, and of ragtag colonists who changed the world. While it might have been acceptable to sanitize history for the very young, it was problematic to continue with a sanitized version of history for older students. In fact, it led to disillusionment and civil unrest. It also led to backlash against those who tried to rectify the problem and expose the gritty nature of U.S. history.

In 1757, the writings of Maurice de Saxe were published. In his Reveries on the Art of War, he suggested revolutionary changes to the formation of a modern army. The modern army as we think of it today had not yet been created. Saxe’s writings, along with those of Carl von Clausewitz and Antoine-Henri Jomini, would change the way nations formed and utilized armies; however, change was a slow process and not universal. When World War II came to a close, the leaders of the great warring nations desired a universal set of rules that would govern modern war. Yet they failed to fully comprehend the difficulty of enforcing such rules. Modern war was not to include the savagery and brutality of previous wars, and while bombing civilians was still being debated as an effective means of ending a war more quickly, civilians were otherwise seen as unacceptable targets in war. Rape of civilians was certainly no longer considered an effective war tactic or even a spoil of war. Part of the early appeal of nuclear weapons was that war by technology seemed more humane, at least for the nation in possession of the technology. It was not just history that was being sanitized, but warfare as well.

Unfortunately, while the Cold War dominated the news, bloody, violent, ugly war continued in many parts of the world. War had not been sanitized, human suffering had not been eradicated, and the great powers could do little but suppress the violence of war. Peacekeeping efforts managed to suppress multi-national escalation, but seldom suppressed the human suffering historically associated with war. What was often suppressed was the news coverage of the realities of war. When stories emerged of horrendous human rights violations during regional or civil wars, it became easy to condemn the perpetrators as savages, ungoverned by the modern rules of war.

Had the history of war not been so sanitized for the general populace of nations like the United States, these realities of war would have been less shocking. War is and has always been horrifying. Terror has always been a part of war. Sadly, for the children reared on the sanitized history and the patriotic rhetoric used during the Cold War, children who are now adults, war became disassociated from terror and horror. War was too often seen as a solution to regional conflict rather than part of the problem.


Endnotes

[1] Carol Berkin, Revolutionary Mothers: Women in the Struggle for America’s Independence (New York: Vintage, 2006), 41.

Further Reading

Berkin, Carol. Revolutionary Mothers: Women in the Struggle for America’s Independence. Reprint edition. New York: Vintage, 2006.

Clausewitz, Carl von. On War. Translated by Michael Howard and Peter Paret, 2010.

Jomini, Antoine-Henri, Baron de. The Art of War. Translated by G.H. Mendell and W.P. Craighill. Radford, VA: Wilder Publications, 2008.

Pape, Robert A. Bombing to Win: Air Power and Coercion in War. Ithaca, NY: Cornell University Press, 1996.

Saxe, Maurice de. Reveries on the Art of War. Translated by Gen. Thomas R. Phillips. Dover ed. Dover Publications, 2007.

Ideology, Revolution, and Change: A Slow Process

On July 4, 1776, the Declaration of Independence was proclaimed to the people of Philadelphia: “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness…” Eleven years later, the Constitution of the United States of America was created, reaffirming the goal to “…establish Justice, insure domestic Tranquility, provide for the common defence [sic], promote the general Welfare, and secure the Blessings of Liberty…” In 1789, Congress defined twelve common rights of U.S. citizens, but only ten of these became amendments to the Constitution. The Bill of Rights defined what the Declaration had not; it defined which rights could be agreed upon as the unalienable rights of man. At the heart of these rights was the belief that the sanctity of thought and property was key to liberty.

Beginning in the 1760s, arguments were made that government should not impinge upon these basic rights. Property was not to be surrendered unless it was given up willingly or forfeited by the judgment of one’s peers. The forfeiture of property was felt to be tantamount to the loss of liberty. While the social strata of the colonies were less rigid than in the Old World, property was still closely associated with one’s identity and stature. The loss of property, even through taxation, was considered a serious matter. Laws impinging on property rights and laws which changed the colonial judicial system most often led to non-violent protest. In many cases the laws were repealed, but they were followed by new laws equally objectionable to the colonists. During the decade leading up to the American Revolution and throughout the years of warfare, an ideology emerged that defined political representation as a fundamental right of the citizen. This was not a new ideology, but one that became well articulated during the numerous debates of the revolutionary period. By the time the U.S. Constitution was drafted, the notion of a government “of the people” was becoming firmly planted in the American psyche. The Preamble stated “We the People” rather than “We the States”; the new nation was formed with the people, rather than the states, as the highest political unit. In 1863, during a bloody civil war, President Abraham Lincoln delivered the Gettysburg Address, in which he reiterated that the nation was a “government of the people, by the people, for the people”. The American Civil War tested the strength of the Constitution and the union it had created. However, it also highlighted that even after more than half a century, the ideology that had declared the equality of man and the right to political representation had not become a universal reality in the United States and its territories. It would not be until the twentieth century that all U.S. citizens would gain the right to vote, and the protection to vote without constraint due to lack of property or social standing.

The American Revolution had not been fought with the intent to change the social dynamics of the people, but the ideology that was established through decades of debate, both before and immediately after the Revolution, would eventually lead to social change. In the United States this social change was slow, sometimes terribly slow and with human suffering the consequence, but with slow change came stability. While many revolutions would follow in the footsteps of the American Revolution, few of the political entities formed from those revolutions witnessed the longevity and stability of the United States, with its slow and never-ending process of ensuring “Life, Liberty and the pursuit of Happiness” for its people.

Change Came Quickly

In 1918, Fritz Haber was awarded the Nobel Prize in Chemistry. World War I delayed the presentation of the award because Haber was a German scientist, one who had gained the name ‘the father of chemical warfare’. Haber was a patriotic German committed to the German cause; however, less than fifteen years after he was celebrated as a great scientist, he fled his homeland fearing for his life. Fritz Haber was a Jew. He was also an intellectual, too closely associated with a war that had been lost rather than won. Like many other German citizens, Haber discovered that under the right set of circumstances hate could replace friendship with great rapidity. Those circumstances included an economic recession, a turbulent political climate, an abundance of persuasive rhetoric, and a highly effective propaganda campaign. In less than two decades, a population that had once celebrated Haber’s achievements turned its back on the evidence that its government had implemented a policy of incarceration and extermination. Race, religious affiliation, sexual orientation, and intellectual interests were more than enough justification for the public to look the other way or, worse, join the Nazi agenda. Change came quickly while the public clung to the notion that they were justified in their actions.

U.S. Compulsory Education: Teaching Exceptionalism

During the mid-nineteenth century, states began passing compulsory education laws, and although all states had these laws in place by the time the United States entered World War I, there was still quite a disparity in the levels of basic education received by the soldiers. Mobilization efforts during WWI highlighted the need for greater emphasis on education in the United States, but they also highlighted the need to emphasize a common nationality among the citizenry. The war had created a stigma on citizens and immigrants who were too closely related to or associated with the enemy. It was felt that the ‘old country’ culture, still held by many, needed to be replaced by a commitment to a less definable, but more patriotic, American culture. The desire to eliminate overt connections with European culture, a culture that seemed to instigate war rather than peace, led to strong measures designed to force change in the U.S. population. One such measure was the effort to eliminate parochial schools, which were viewed as being too closely tied to European culture. When Oregon amended its compulsory education laws in 1922 with the intent to eliminate parochial schools, the state faced opposition, including a Supreme Court case that ultimately ruled against it. It was hoped that public education would transform the population into a more cohesive culture, and while states could not force public school attendance over private school attendance, over time many states were able to dictate curriculum requirements and achieve the underlying goals sought by legislators during the post-war period.

Many in the United States believed that the nation had a vital responsibility to encourage and spread notions of republican democracy. A growing belief in ‘American exceptionalism’ developed in the post-war years, due in part to wartime propaganda. If the United States was to be exceptional, then it needed to guarantee that its public understood what made it exceptional. Accomplishing this task meant that its citizenry needed to understand history: not just the history of the United States beginning with colonization or independence, but also the connection between the United States and the ancient history in which the foundations of democracy resided. Compulsory education, classes in American History and Western Civilization, and an emphasis on U.S. exceptionalism became the foundation for unifying the nation during the twentieth century.

Myth, Folklore, History, and Nationalistic Pride

Recently the story of “Butch” O’Hare was recounted to a captivated audience.[1] As the tale of bravery came to an end and people reached to wipe their eyes, the thought came to my mind of the important role myth, folklore, and history play in creating nationalistic pride. Two hundred years ago, concerned with the changes technology and urbanization were having on society, two German brothers began to collect folktales. Wilhelm and Jacob Grimm, like other romantics, believed that folktales “were essential for reinvigorating national literatures and saving these literatures from sterile intellectualism.”[2] In 1968, during the height of the revisionist movement, historian Thomas Bailey wrote that if the pursuit of history were to “shatter all myths, our social structure would suffer a traumatic shock.” He went on to state, “Historical myths and legends are needful in establishing national identity and stimulating patriotic pride.”[3] During times of societal change and strife, the importance of mythology is heightened and people cling to the stories that make them feel good. Historical precision and factualism are of less importance and can even be seen as unpatriotic. During the height of the Cold War, the importance of folklore became an issue of national security. In a heated debate over federal money being used to support the study of folklore, one historian wrote that attempts to stifle the study of folklore could “cripple the efforts of the free world to combat the communist states, who [knew] well how to reach the hidden millions with the shrewd manipulation of folklore, legend, and myth.”[4]

Clearly, removing folklore and mythology from the study of history is dangerous to the social structure and unity of a nation. However, the reverse could also hold true: removing ugly historical facts and social realities from the study of history could be just as dangerous. In a world where technology is creating new communities which ignore national borders and bring together people who were once separated by geography, the promotion of national myth rather than national reality can undermine the success of international efforts to tackle world problems. While not all patriotic historical reminiscing would be detrimental to international cooperation, jingoistic versions of a nation’s history which clearly whitewash a nation’s less than noble past can harm the nation’s credibility and fuel the fires of hatred that seldom cease to exist in the world. Furthermore, the patriotic rhetoric and reminiscence of national grandeur and exceptionalism often “alienates” a nation’s friends.[5] Myth, folklore, and history can engender nationalistic pride, but they can also become tools used by a nation’s enemies to rally support for terrorism, even homegrown terrorism. People do not like to be lied to, and learning that the noble stories of a nation’s past are not always entirely factual leads to disillusionment. Therefore, a balance must be found wherein the myths, folklore, and history of a nation are all embraced and nationalistic pride is derived from that balance.

Endnotes

[1] Scott Simon, “He Gave His Life For The Nation And His Name To An Airport,” NPR.org, May 24, 2014, http://www.npr.org/2014/05/24/315259241/butch-ohare-the-heroic-namesake-of-chicagos-airport.

[2] Elliott Oring, Folk Groups and Folklore Genres: An Introduction (Logan, Utah: Utah State University Press, 1986), 5.

[3] Thomas A. Bailey, “The Mythmakers of American History,” The Journal of American History 55, no. 1 (1968): 5, http://www.jstor.org/stable/1894248.

[4] Richard M. Dorson, “Folklore and the National Defense Education Act,” The Journal of American Folklore 75, no. 296 (April 1962): 164, accessed July 24, 2013, http://www.jstor.org/stable/538177.

[5] Joseph S. Nye, The Paradox of American Power: Why the World’s Only Superpower Can’t Go It Alone (New York: Oxford University Press, 2003), xiv.