Category Archives: Current Affairs

Looking Forward – Learning from the Past

Seven decades ago, the United States emerged from World War II relatively unscathed compared to the other great nations of the world. It found itself in a position to help rebuild, and in doing so it prospered. This prosperity was evident in its purchasing power. A look through the cupboards and attics of our aging population will unearth the evidence of that purchasing power: crystal and silver tea services, porcelain and fine china, flatware of the highest quality, and linens too lovely to ever have been used except for the most special occasions. These imported items, often gifts associated with marriage and new life, were made in recovery zones and helped reestablish the war-torn markets and industries vital to the lives of those fortunate enough to have survived a horrific war. These items also confirmed to the U.S. populace that they had fully become a great power, much like the empires that had dominated the century before.

The purchasing power and abundance of the postwar era in the United States also provided a balm for the hardship from which so many had suffered. The world had been either at war or mired in financial depression for over three decades when WWII ended; an entire generation felt the burden of despair lifted when the industrial and economic potential of the United States was realized after the war. The youth, those too young to have felt the full brunt of hardship, reached adulthood in the glow of economic world domination. This glow was only slightly dimmed by the threat of nuclear war, a threat that increased as they aged but did little to blunt their earning power. War machines equaled economic growth as long as the nation continued to view such development as vital. Once that view shifted, however, the realities of overextension and taxation created an ever-growing sense of waste and loss. The greatness of their youth seemed to have slipped away, and in its place only a sense of uncertainty remained. The Cold War, with all its ills, provided secure jobs and a sense of proactive security. When it ended, a new generation faced the aftermath of war. For them, the balm came in the form of a technology boom, rapidly falling interest rates, and open borders; these changes provided the American Dream the youth had heard about but had worried would be beyond their reach.

As the twenty-first century dawned, rumblings of change and challenge emerged: first with the Y2K fears and then with the market crash following the September 11 terror attacks. A nation which had for so many years found economic stability in military development and distant wars once again turned to war as a means to unify and solidify a shaken populace. However, unlike during the Cold War, the United States had lost its standing as the sole superpower. Politically, economically, and militarily, others had risen from the ashes and emerged as powerful equals. No longer was the United States seen as the great protector; rather, many saw the United States as a threat to peace. Others questioned whether the political system, which had weathered over two hundred years of challenge, would survive the challenges of the new century. Unlike in recent history (the last three hundred years or so), the new century saw a return of conflict dominated by non-state actors, creating a longing for the seemingly stable nature of the Cold War, stable despite its harsh suppression of ethnic conflict, damaging political interference, and costly proxy wars.

This longing for the stability and prosperity of the Cold War provided fuel for the fear, anger, and desperate hope which motivated many as they voted yesterday. The new century has not secured the American Dream for its younger generation; rather, it seems only to have jeopardized it for the older generation. Conservative or liberal, the policies formed in national and state capitals seem, at best, to be bandages rather than sutures. Few anticipated a speedy recovery, but many are willing to risk experimental treatment in the hopes of a miracle cure. The nation should survive this latest illness, and the treatment it has chosen; however, it is unlikely that the youth, the youngest voters, will find the balm their parents and grandparents found in an economic boom. Industry, and even much of technology, has gone elsewhere. The borders of nations are closing rather than opening. Peace is threatened as much by the turmoil within as by threats from without, and the economy is adversely affected by all the uncertainty. The generations who have suffered the ills and recoveries of the past may be too fatigued to calm the fears and fevers of today’s youth. There simply may be no balm.

History oftentimes seems to be about groups of people working against or for an issue. After destructive wars, terrible depressions, or horrific epidemics, people tend to work together to bring about recovery, with special concern for the young, who are always the true hope for a better future. At this time, when the ills that face the world are less tangible but no less threatening, it is vital, as we look to history for the lessons taught by groups of people in the past, that we remember the work always began and ended with the individual: the individual who created the cure, who did the work, and who didn’t lose hope. Never did they wash their hands and walk away from the crisis or turn their backs on the young; rather, they recognized that the young are, in reality, the key to the stability and prosperity so sought after.

History: More than a Story

Broad-based or narrowly focused, history is not merely a collection of data; rather, it is a story. At times the story may seem dull, at other times captivating. The study of history can introduce us to the challenges and triumphs of the past. It can help us see patterns in the ‘action and reaction’ cycle of human relations. It can help us learn from the past events which have paved the way for present actions. However, it can only teach us if we are willing to learn. Simply hearing the story is not enough. Regardless of how enthralling, action-packed, or awe-inspiring it may be, history is not simply a story to be heard. It is a story to be understood.

Whether we look at the rise of Hitler, the arms race of the Cold War, or the growth of empire through colonization, history can teach us how groups of humans react when they feel threatened by other groups of humans. During the interwar period in Germany, the people felt sorely abused by the rest of Europe. They sought a change and a savior from the economic oppression they felt was unjust. During the Cold War, citizens on both sides sought powerful military might as a means of protection from a threat that was often more ideological than physical. They didn’t simply want a powerful government; they wanted an all-powerful government that could protect them from phantoms as well as from armies. In both of these historical stories, if we take the time to study them rather than simply hear them, we can learn that people are willing to give up basic human and civil rights in order to feel protected from outside threats. Additionally, if we go beyond the simple narrative often taught in history primers, we can see cases where people were easily persuaded to put aside their moral compass in order to achieve group affiliation and protection. While the story of Hitler and his atrocious reign of power might more easily provide examples of how people can become swayed by nationalism and nativism, the story of the Cold War also provides examples. Foreign relations, the relations between nations rather than individuals, oftentimes reflect the very nature of human relations. Just as human and civil rights were often trampled upon in both the United States and the Soviet Union by their own respective citizenry, national sovereignty and the right to self-determination were often trampled upon by the superpowers as they spread their economic, political, and military influence. The notion that ‘might makes right’ was not constrained.

The notion of ‘might makes right’ is clearly depicted in the colonization period leading up to the twentieth century. Peoples who seemed less civilized in comparison to the social and political norms of Europe were to be suppressed and subjugated, or eradicated if they would not accept their place in the more ‘civilized’ society. Moral qualms were assuaged by dehumanizing those who did not fit the norm and who did not hold the power. This was not the first time in history that the ‘other’ had been dehumanized for social or political gain, but the practice now became normalized as culturally acceptable. Even as slavery lost support, colonial conquest and rule, including the westward expansion of the United States, reinforced the idea that certain peoples were more valuable than others. The mighty western nations viewed their culture as better than the rest, and believed that forced assimilation was right and justified.

To the victor goes the spoils, and also the chance to write the story, but history is more than just one person’s or nation’s account. It is a compilation of stories from many different perspectives. Like the heroic sagas of old, history can inspire and teach lessons to the listeners, but the study of history can do more. It can dispel notions that any one group of people is more perfect or more sinful than the others. It highlights the shared humanity of man, a humanity that is full of valor and full of vice.

Eisenhower: Popular Presidential Candidate

Political campaign season tends to encourage comparisons. Recently a journalist noted that Dwight D. Eisenhower had never held public office prior to holding the highest public office of the United States. Eisenhower was a military man who had never voted for president, yet found himself asked by members of both political parties to run for president. In the end, and after much encouragement, Eisenhower chose to run for president with the Republican Party. His successful campaign, fueled by the slogan “I like Ike,” was supported by a public hoping he would work to fix a broken national government.[1]

Eisenhower was a straight-talking man who had honed his style and mannerisms during a lifetime of military service. Accepted into West Point in 1911, he began his service to his nation and committed himself to a life of duty and honor.[2] His experience as a leader grew during the three decades leading up to World War II and during his time as Supreme Commander of the Allied Expeditionary Force. Military leadership at the level reached by Eisenhower required political skill and the ability to use diplomatic finesse. His experience as an able politician was refined when he was chosen to be the first Military Governor of the American Zone of Occupied Germany, and as he maneuvered through the political tensions that accompanied the position of Supreme Allied Commander Europe during the turbulent early period of the Cold War. While Eisenhower may not have engaged in domestic politics prior to running for the office of U.S. President, he was not unfamiliar with the skills and demeanor required of a U.S. president. The nation didn’t just “like Ike”; it loved him for what he represented and for the manner in which he conducted himself.

 

Endnotes

[1] “Dwight D. Eisenhower: Campaigns and Elections,” The Miller Center of Public Affairs, University of Virginia, http://millercenter.org/president/biography/eisenhower-campaigns-and-elections.

[2] “Dwight David Eisenhower,” US Army Center of Military History, http://www.history.army.mil/brochures/ike/ike.htm.

Further Material

The American Experience: The Presidents, Dwight D. Eisenhower – PBS

The American Experience: The Presidents, Dwight D. Eisenhower – Documentary Part 1 (YouTube)

The American Experience: The Presidents, Dwight D. Eisenhower – Documentary Part 2 (YouTube)

Service rather than Indifference

It could be said that Woodrow Wilson’s ideas are like a work of art. While the artist lived, the world was slow to embrace the art, but after the artist’s death, the world recognized the greatness of the work. As with a work of art, interpretation is highly subjective, creating great potential for debate and disagreement.

In October 1916, Edward M. “Colonel” House, an American diplomat, stated, “We are part of the world…nothing that concerns the whole world can be indifferent to us.” During the same month, President Wilson stated that the United States would need to “serve the world.”[1] In order to provide this service, Wilson believed that a change in how international relations were conducted would be needed. It was vital that the old system of alliances be replaced by a new system of international cooperation.

Wilson was correct about the need for a new world order, and despite a growing isolationist movement in the United States, there would be no turning back from greater international political involvement. At the end of the Second World War, the United States played a dominant role in the international political body created to replace the failed League of Nations. While the United Nations would be both valued and criticized, it would, through accident or plan, become a way for nations to work together in war-torn regions of the world. Conflict and hostility might not have been eradicated through international cooperation, but service to the world’s population through peacekeeping efforts did, in some measure, fulfill the progressive ideas of the early twentieth century. Certainly, it became harder for any powerful nation to remain indifferent to the concerns of the world.

[1] George C. Herring, From Colony to Superpower: U.S. Foreign Relations Since 1776 (New York: Oxford University Press, 2008), 407.

When Chaos Threatens, Diplomacy Struggles

Chaos breeds fear much like an insidious virus; everyone becomes fearful that they will be next to catch it. Segregation is then seen as a positive means of prevention, a measure taken before eradication can commence. Calls for calm and cooperation become drowned out by vitriolic shouts for action. It seems that when chaos threatens human cooperation, tact and finesse are the first casualties. Within the world of international cooperation, chaos creates a force against which diplomacy struggles to survive. By the end of World War II, chaos had taken a terrible toll on humanity. Devastating war, multiple pandemics, and a severe economic depression had all contributed to a general fatigue which left many seeking strong leadership rather than diplomatic dialogue. The rise of authoritarian leadership should not have surprised many, nor should there have been surprise that some desired isolation. As in the case of the insidious virus, many felt that segregation from the problem was the logical solution. Others placed their faith in military strength and vitriolic rhetoric. World War II demonstrated that neither segregation nor authoritarian leadership would stop chaos. A terrible truth became evident: the world was too interconnected to ever truly support isolationist policies or prevent the chaos which can derive from authoritarian regimes. However, even as the interconnectedness of the world became an undisputed fact and the vital role of international diplomacy became apparent to those who had once questioned its value, the chaos of a post-WWII world threatened the very cooperation that had brought the war to an end.

World War II had ceased, but the suffering caused by war had not. Additionally, the process of decolonization was creating renewed competition for areas of the world which had previously been controlled by foreign powers. A post-colonial world was ripe for chaos, particularly political chaos. The great powers of the day did not wish to see the return of any form of chaos, particularly chaos located in their own backyard. While the Cold War has been characterized as a war between ideologies, it can also be viewed as a war to eradicate regional chaos. The United States and the Soviet Union both developed international policies which were authoritarian in nature. The nations of the world felt distinct pressure to choose a side. Traditional diplomacy suffered even as the United Nations worked to promote peace through diplomatic means. At the end of the day, pressure in the forms of military posturing and economic support or sanction often dictated international relations more than traditional diplomacy did. For nearly fifty years, the United States and the Soviet Union managed to keep the chaos from spreading within their own borders. As with a virus, small outbreaks were to be expected, but the big pandemic was avoided. If chaos was a virus, then the Cold War cure was death to the host when segregation proved ineffective. Diplomacy might seem a slow and imperfect treatment for the conflicts that threaten to unleash chaos, but is there truly wisdom in containing chaos through the threat or creation of greater chaos? Some will argue yes while others shudder no, but both should agree that when chaos threatens, diplomacy struggles.

 

Further Reading:

Herring, George C. From Colony to Superpower: U.S. Foreign Relations Since 1776. New York: Oxford University Press, 2008.

Lind, Michael. The American Way of Strategy: U.S. Foreign Policy and the American Way of Life. Oxford University Press, USA, 2006.

Weigley, Russell F. The American Way of War: A History of United States Military Strategy and Policy. New York: Macmillan Publishing Company, 1973.

Zubok, Vladislav M. A Failed Empire: The Soviet Union in the Cold War from Stalin to Gorbachev. Chapel Hill, NC: The University of North Carolina Press, 2009.

Home Production and National Defense

One hundred years ago, malnutrition was a problem that worried a nation facing war. Industrialization and urban growth had moved large populations into congested cities and away from rural communities. Both World War I and World War II would see an increase in the urbanization of the United States. The progressive reformers of the early twentieth century recognized that urbanization was leading to an unhealthy population and pushed for reform. They also pushed for vocational education, particularly in the area of what would become known as Home Economics.

One of the great misconceptions of the modern age is that the skills of the preindustrial age were easily passed from generation to generation, and that it is only modern society that struggles with the problems associated with the loss of these skills. Unlike information, which can simply be disseminated, knowledge is gained through practice. Skilled crafts and vocations require practice and often a good deal of instruction by a skilled guide. Remove proper training, and the skills are not learned and society struggles. In particular, modern society struggles with issues of malnutrition and, more recently, obesity, both of which can be directly linked to a lack of basic knowledge of nutritional food consumption. It could also be argued that the conveniences of modern food production contribute to the problems, especially when the issue of ‘prepared’ foods is under discussion. Despite the flood of DIY programs and videos demonstrating cooking and gardening techniques, home production and preparation of food is not as common as a healthy society needs it to be.

New technology in the early 1900s brought advancements in home food production and storage, but the skills needed to safely process food had to be learned. During WWI, home canning and food storage were demonstrated and encouraged by divisions of local government and subsidized by the U.S. Department of Agriculture.[1] The Smith-Lever Act and the Smith-Hughes Act provided funding for increased training in food production and domestic skills.

According to historian Ruth Schwartz Cowan, the “decade between the end of World War I and the beginning of the depression witnessed the most drastic changes in patterns of household work.”[2] Industrialization was changing the way work was managed, not just in the factories but also in the homes. Industrialization increased the availability of commodities, many of which made household work less time-consuming and arduous. Convenience is usually an appreciated commodity, especially for those tasked with managing a household while feeling the pressures of working outside the home. However, the skills that had been learned before convenient options became available were not always passed down to the next generation. Much like the youth of today, the youth of past generations seldom liked learning to do things the old-fashioned way, especially not when new technology and innovation were changing the world.

In order to offset the trend and ensure a healthier society, young women in private and public schools were taught the skills that many today assume would have been handed down from mother to daughter. Books titled Clothing and Health, Shelter and Clothing, Foods and Household Management, and Household Arts for Home and School were produced and marketed to U.S. high schools. In the words of one author, “The authors feel that household arts in high schools should not be confined to problems in cooking and sewing. They are only a part of the study of home making.” In the 1915 edition of Shelter and Clothing, an entire chapter is dedicated to “the water supply and disposal of waste,” and includes diagrams of the modern flushable toilet. Technology had changed the lives of many, but progressive minds of the age could see that new technology had to be integrated into society through education rather than simply leaving society to work through the changes without assistance.

World War I, the Great Depression, and World War II jolted policy makers into action. By the mid-1950s, Home Economics, as a high school subject, was accepted as an integral part of keeping the nation healthy and ready for future war. Even as warfare became more mechanized, the nation still held on to the belief that a healthy society was a strong society, and many school systems encouraged both male and female participation in Home Economics during the early 1980s. Unfortunately, the Technological Revolution of the 1990s and 2000s shifted the mindset of many, and like the industrial revolutions of the past, this latest revolution has elevated convenience over skill. While information is just a click away, the knowledge that comes from skilled instruction is often harder to obtain, placing the nation at risk once more.

 

 

Endnotes

[1] Emily Newell Blair and United States Council of National Defense, The Woman’s Committee: United States Council of National Defense, An Interpretative Report, April 21, 1917, to February 27, 1919, e-book (U.S. Government Printing Office, 1920).

[2] Ruth Schwartz Cowan, “The ‘Industrial Revolution’ in the Home: Household Technology and Social Change in the 20th Century,” Technology and Culture 17, no. 1 (1976): 1–23.

 

Rumors and Rhetoric

In 1783, at the army camp located in Newburgh, New York, rumors of revolt were quelled when General George Washington addressed his men. The rhetoric, which had grown from frustration with Congress over back pay, was effectively countered when Washington spoke: “…let me entreat you, Gentlemen, on your part, not to take any measures, which, viewed in the calm light of reason, will lessen the dignity, and sully the glory you have hitherto maintained…”[1] Scholars have argued over whether the crisis in Newburgh was one of rhetoric only, or if an actual conspiracy existed which threatened the stability and future of the United States.[2] Regardless, the Newburgh Affair highlights how political rhetoric can lead to crisis, and how calm leadership rather than dramatic action can be the solution.

Conspiracy theorists and politically motivated historians have inferred that orchestrated nationalist machinations were the cause of the rumors and implied threats that swirled around Newburgh in the fall and winter of 1782-83. Others argue that frustration at the lack of pay, and worry about a post-conflict future, organically inspired the rhetoric Washington felt needed to be addressed on March 15, 1783. Pamphlets, newspapers, public meetings, and personal correspondence were the main vehicles for the spreading of news and the airing of grievances prior to the technological age. The years leading up to the outbreak of war had proved that these were effective tools for rousing public opinion in order to force change. It stood to reason, then, that these same tools would be used when Congress ground to a standstill on the issue of military pay and veteran benefits.

Even in the days before technology transformed the ways in which the world communicated, rumors once started were difficult to suppress. Inflamed rhetoric was even harder to manage, for often it was printed and preserved for posterity. Fortunately for the young republic, General Washington was a man who had learned that brash language and rash actions were counterproductive to stability and prosperity. While he understood the frustration[3] of his men, he also understood that a liberty so newly achieved could not withstand civil discord.[4] A nation built from the fire of revolution would have to learn how to handle and even embrace civil discord; however, Washington was wise in objecting to discord created by “insidious design” and spread by rumor and extreme rhetoric.

 

Endnotes

[1] George Washington, George Washington: Writings, vol. 91, Library of America (New York: Library of America, 1997), 499.

[2] C. Edward Skeen and Richard H. Kohn, “The Newburgh Conspiracy Reconsidered,” The William and Mary Quarterly 31, no. 2 (1974): 273–298.

[3] Mary Stockwell, “Newburgh Address,” George Washington’s Mount Vernon, http://www.mountvernon.org/research-collections/digital-encyclopedia/article/newburgh-address/.

[4] Washington, 500.

Military Superiority Leads to Decline

As the twentieth century ended and the specter of the Cold War appeared to be fading into history, political scientists pondered the question of how a new world order would take shape under the direction of a victorious superpower. As John Ikenberry stated, victors try “to find ways to set limits on their powers and make it acceptable to other states.”[1] The United States, having spent a century building its image as a military power determined to protect the world from evil and, in doing so, spread democracy, found itself in a dilemma. While talking heads and braggarts proclaimed U.S. superpower greatness, diplomats faced the harsh reality that yesterday’s protector can quickly become today’s bully and tomorrow’s enemy. Additionally, the economic strain military spending places on a society can become politically detrimental once victory occurs. In the past it was said that to the victor goes the spoils, but in modern times, with plundering frowned upon, the victor tends to win a headache both at home and abroad without seeing any real benefit. Without a change in policy, particularly policy pertaining to its military superiority and status, a victorious nation discovers that military superiority can lead to economic and political decline.

Of the many headaches the United States developed as the sole superpower in the years following the end of the Cold War, probably the most contentious was being asked to intervene in conflicts great and small. Seldom was there a clearly right side and a clearly wrong side to support. In many cases the crises that prompted the debate over intervention occurred in regions that had previously been under the political, economic, and military supervision of the Soviet Union. Even when operating under the umbrella of the United Nations, U.S. intervention could stir conflicting emotions in the crisis region. The United States had been both the enemy and the possessor of enviable commodities for fifty years. Envy and distrust were not feelings easily eradicated simply because the war was over. In a world that seemed to be rupturing in the absence of Cold War superpower dominance, the United States struggled with its expanded role of policeman, banker, and social worker.

Military dominance, which had spurred the U.S. economy in the years following World War II, became a burden following the end of the Cold War. In the wake of international cooperation and the perception of peace, nations could shift away from military technology as a basis of economic growth. Nations which remained entrenched in military development became economically dependent on wars that seldom required Cold War technology. Furthermore, Cold War technology had been all about fighting a war from a distance, and the conflicts of the twenty-first century required boots on the ground. When President Truman and President Eisenhower put their support behind the development of nuclear technology and behind the technology to deliver nuclear weapons from a distance, part of their justification was that it would reduce U.S. casualties and, hypothetically, shorten if not prevent war. Their reasoning was based predominantly on the notion that nations would fight nations, and that the days of tribal warfare were becoming part of the past. When the theories and perception of modern war shifted after the attacks on the United States in 2001, the world powers seemed taken by surprise. When the Second Gulf War did not produce the results predicted, when peace did not flourish, and when terrorism spread rather than diminished, the United States seemed not only surprised but confused. The U.S. war strategy and military development, so honed during the twentieth century, did not work in the twenty-first. A nation which had grown powerful through military superiority found itself the targeted enemy rather than the celebrated hero. Furthermore, it found itself struggling to justify increasing national debt, made larger by wars that seemed to have no end. Like many great powers which had come before, the United States faced decline despite continued military superiority. In fact, it could be argued, the United States faced decline because of its military superiority.

 

 

Endnotes

[1] G. John Ikenberry, After Victory: Institutions, Strategic Restraint, and the Rebuilding of Order after Major Wars (Princeton: Princeton University Press, 2001), xi.

 

Further Reading

Hixson, Walter L. The Myth of American Diplomacy: National Identity and U.S. Foreign Policy. New Haven, CT: Yale University Press, 2008.

Kennedy, Paul. The Rise and Fall of the Great Powers. New York: Vintage Books, 1989.

Obligated to Intervene

In 1820, the Congress of Troppau was convened. The great powers of the day determined that they held the right to intervene in the revolutionary conflicts of neighboring states. Maintaining the status quo and preventing the spread of nationalism and revolution was viewed as vital in the quest to quell the type of conflict that had erupted in Europe during the French Revolution and the Napoleonic Era. While the beginning of the century had been fraught with what some called the first worldwide war, the remainder of the century saw only regional conflicts, most of which were harshly quelled before they could spread beyond their borders. However, the policy of intervention did not quell nationalism. During the twentieth century, nationalism would be at the heart of many conflicts, and the notion that great nations had the right to intervene to protect the status quo would be at the center of international policy for many nations, including the United States.

In the case of the United States, intervention became a tool to either protect or disrupt the status quo in a region, depending on which was most beneficial to the interests of the United States. Intervention often placed the nation at odds with its own revolutionary history and patriotic rhetoric. Despite seeming hypocritical in nature, the United States was not forging new diplomatic patterns but rather following the patterns established by the great powers of the past. The U.S. Founding Fathers may have wanted to distance themselves from the politics and practices of Europe, but their descendants embraced those policies as the United States rose to international supremacy during the twentieth century.

During the rise to superpower status, the United States benefited economically and politically. The right to intervene allowed the United States to protect economic markets, and in some cases add new markets and resources to its growing stockpile. While the nation doggedly denied that it was an empire, by the end of the twentieth century the problems associated with empires began to plague the nation. Most prominently, it could be argued, the United States faced the growing international expectation that it would intervene when conflict threatened a region’s status quo. After a century of gaining prominence and wealth through international intervention, often with the sole goal of protecting resources and markets, the United States found that the right to intervene had transformed into an obligation to intervene.

Historiography: How the Story is Written

In the modern world of minute-by-minute news coverage, it is easy to assume that history is being recorded both comprehensively and accurately. One may even think that the role of the historian is passé, and that all that is needed for the modern world is the analyst who will try to make sense out of current events. Even in a world where the events of the day are documented, and where social media can turn our most mundane activities into a historical sketch that we can share with all of our cyber friends, the role of the historian is crucial. It may even be more crucial than ever before because of the sheer volume of data that must now be sifted through in order to create a comprehensive, yet pertinent, story.

Accuracy in the historical record has always been important to historians, but it has not been nearly as important as the story. In the days in which history was borrowed from others in order to bolster a rising nation’s image, accuracy was often less important than fostering the image that a new empire was ancient and eternal in origin. A good example of this is found with the Roman Empire, which, having risen in power, desired a historical record that would magnify its greatness rather than highlight its youth. Throughout history, political entities as well as powerful individuals have sought to bolster their images by creating histories that connect them to other prestigious historical events, periods, and reigns. By likening themselves to others who were dynamic, successful, dominant, and strong, they create an image of grandeur that is only partially based on the events of their own time and of their own making.

As technology and the availability of the written record evolved over the centuries, it became harder to use the history of others as a means by which one’s own history could be created. Even before the printing press, some historians began comparing their own region’s historical journey with that of their neighbors. In some cases, as with the historian Tacitus, the neighbor was heralded for its purity and simplicity in contrast to the corruption at home. In other cases, the neighbor was villainized in an attempt to deflect attention away from unpopular home-state policy. In either situation, the history of others was borrowed, no longer as a means to explain where a political state had come from, but rather to explain how the home-state compared to others. This trend created an interesting phenomenon in the writing of history. No longer was it simply good enough to extol the greatness of one’s own past; now it was acceptable, and even expected, to criticize the neighbor as a means of exalting oneself. By making the neighbor seem less noble or even villainous, the historian could create an even more illustrious history of the home-state.

In the not so distant past, historians were at the whim and will of powerful men who could finance the historical pursuits of the scholar. Modern technology has changed this to some extent. Scholarly history may still be contingent on research grants made possible by powerful institutions and individuals, but technology has made everyone who uses the internet a historian, or at least a historical participant. No longer is it only the famous, powerful, or well-connected who get recorded. Some individuals may only be contributors of data, whereas others may add more significantly to the record of daily events. In this world of high-speed technology and vast data collection, history is being recorded more thoroughly than ever before, but that doesn’t mean the record is any more accurate. Often, history is being recorded in very inaccurate ways and by people with little to no understanding of the ramifications this has for both the people of today and the historians of tomorrow. In the race to beat the ‘other guys’ to the best story, accuracy, once again, is secondary to the story being told.

Modern historians, bound by ethical parameters of historical accuracy, try to share a story that is comprehensive and as unbiased as possible. They are taught to question sources and present a full picture of an event, a person, or a period of time. In some cases, they are even taught to use good writing skills in order to make the story enjoyable to read. They are taught to recognize that history is not always pleasant, but that it can always be of value, even if only to a few. At times, history can be a story of valor, bravery, and patriotic glory. At other times, history can be just the opposite. The modern historian may write a tale that makes some readers uncomfortable, but the job of the historian is to write a comprehensive and pertinent story rather than the myths and propaganda so many others are determined to write.