Home Production and National Defense

One hundred years ago, malnutrition was a problem that worried a nation facing war. Industrialization and urban growth had moved large populations into congested cities and away from rural communities, and both World War I and World War II would see an increase in the urbanization of the United States. The progressive reformers of the early twentieth century recognized that urbanization was leading to an unhealthy population and pushed for reform, including vocational education, particularly in the area that would become known as Home Economics.

One of the great misconceptions of the modern age is that the skills of the preindustrial age were easily passed from generation to generation, and that only modern society struggles with the problems associated with their loss. Unlike information, which can simply be disseminated, knowledge is gained through practice. Skilled crafts and vocations require practice and often a good deal of instruction by a skilled guide. Remove proper training, and the skills are not learned and society struggles. In particular, modern society struggles with malnutrition and, more recently, obesity, both of which can be directly linked to a lack of basic knowledge about nutritious food. It could also be argued that the conveniences of modern food production contribute to these problems, especially when the issue of ‘prepared’ foods is under discussion. Despite the flood of DIY programs and videos demonstrating cooking and gardening techniques, home production and preparation of food is not as common as a healthy society requires.

New technology in the early 1900s brought advancements in home food production and storage, but the skills needed to safely process food had to be learned. During World War I, home canning and food storage were demonstrated and encouraged by divisions of local government and subsidized by the U.S. Department of Agriculture.[1] The Smith-Lever Act and the Smith-Hughes Act both provided funding for increased training in food production and domestic skills.

According to historian Ruth Schwartz Cowan, the “decade between the end of World War I and the beginning of the depression witnessed the most drastic changes in patterns of household work.”[2] Industrialization was changing the way work was managed, not just in factories but also in homes. It increased the availability of commodities, many of which made household work less time consuming and arduous. Convenience is usually appreciated, especially by those tasked with managing a household while feeling the pressures of working outside the home. However, the skills that had been learned before convenient options became available were not always passed down to the next generation. Much like the youth of today, the youth of past generations seldom liked learning to do things the old-fashioned way, especially not when new technology and innovation were changing the world.

In order to offset the trend and ensure a healthier society, young women in private and public schools were taught the skills that many today assume would have been handed down from mother to daughter. Books titled Clothing and Health, Shelter and Clothing, Foods and Household Management, and Household Arts for Home and School were produced and marketed to U.S. high schools. In the words of one author, “The authors feel that household arts in high schools should not be confined to problems in cooking and sewing. They are only a part of the study of home making.” In the 1915 edition of Shelter and Clothing, an entire chapter is dedicated to “the water supply and disposal of waste,” complete with diagrams of the modern flushable toilet. Technology had changed the lives of many, but progressive minds of the age could see that new technology had to be integrated into society through education rather than simply leaving society to work through the changes without assistance.

World War I, the Great Depression, and World War II jolted policy makers into action. By the mid-1950s, Home Economics, as a high school subject, was accepted as an integral part of keeping the nation healthy and ready for future war. Even as warfare became more mechanized, the nation still held to the belief that a healthy society was a strong society, and many school systems encouraged both male and female participation in Home Economics during the early 1980s. Unfortunately, the Technological Revolution of the 1990s and 2000s shifted the mindset of many, and like the industrial revolutions of the past, this latest revolution has elevated convenience over skill. While information is just a click away, the knowledge that comes from skilled instruction is often harder to obtain, placing the nation at risk once more.

Endnotes

[1] Emily Newell Blair and United States Council of National Defense, The Woman’s Committee: United States Council of National Defense, An Interpretative Report, April 21, 1917, to February 27, 1919, e-book (U.S. Government Printing Office, 1920).

[2] Ruth Schwartz Cowan, “The ‘Industrial Revolution’ in the Home: Household Technology and Social Change in the 20th Century,” Technology and Culture 17, no. 1 (1976): 1–23.

Rumors and Rhetoric

In 1783, at the army camp in Newburgh, New York, rumors of revolt were quelled when General George Washington addressed his men. The rhetoric, which had grown from frustration with Congress over back pay, was effectively countered when Washington spoke: “…let me entreat you, Gentlemen, on your part, not to take any measures, which, viewed in the calm light of reason, will lessen the dignity, and sully the glory you have hitherto maintained…”[1] Scholars have argued over whether the crisis at Newburgh was one of rhetoric only, or whether an actual conspiracy existed that threatened the stability and future of the United States.[2] Regardless, the Newburgh Affair highlights how political rhetoric can lead to crisis, and how calm leadership rather than dramatic action can be the solution.

Conspiracy theorists and politically motivated historians have inferred that orchestrated nationalist machinations were the cause of the rumors and implied threats that swirled around Newburgh in the fall and winter of 1782-83. Others argue that frustration over the lack of pay, and worry about a post-conflict future, organically inspired the rhetoric Washington felt needed to be addressed on March 15, 1783. Pamphlets, newspapers, public meetings, and personal correspondence were the main vehicles for spreading news and airing grievances prior to the technological age. The years leading up to the outbreak of war had proved these to be effective tools for rousing public opinion in order to force change. It stood to reason, then, that the same tools would be used when Congress ground to a standstill on the issue of military pay and veteran benefits.

Even in the days before technology transformed the ways in which the world communicated, rumors once started were difficult to suppress. Inflamed rhetoric was even harder to manage, for often it was printed and preserved for posterity. Fortunately for the young republic, General Washington was a man who had learned that brash language and rash actions were counterproductive to stability and prosperity. While he understood the frustration[3] of his men, he also understood that a liberty so newly achieved could not withstand civil discord.[4] A nation built from the fire of revolution would have to learn how to handle and even embrace civil discord; however, Washington was wise in objecting to discord created by “insidious design” and spread by rumor and extreme rhetoric.

Endnotes

[1] George Washington, George Washington: Writings, vol. 91, Library of America (New York: Library of America, 1997), 499.

[2] C. Edward Skeen and Richard H. Kohn, “The Newburgh Conspiracy Reconsidered,” The William and Mary Quarterly 31, no. 2 (1974): 273–298.

[3] Mary Stockwell, “Newburgh Address,” George Washington’s Mount Vernon, http://www.mountvernon.org/research-collections/digital-encyclopedia/article/newburgh-address/.

[4] Washington, 500.

Military Superiority Leads to Decline

As the twentieth century ended and the specter of the Cold War appeared to be fading into history, political scientists pondered how a new world order would take shape under the direction of a victorious superpower. As John Ikenberry stated, victors try “to find ways to set limits on their powers and make it acceptable to other states.”[1] The United States, having spent a century building its image as a military power determined to protect the world from evil and in doing so spread democracy, found itself in a dilemma. While talking heads and braggarts proclaimed U.S. superpower greatness, diplomats faced the harsh reality that yesterday’s protector can quickly become today’s bully and tomorrow’s enemy. Additionally, the economic strain military spending places on a society can become politically detrimental once victory occurs. In the past it was said that to the victor go the spoils, but in modern times, with plundering frowned upon, the victor tends to win a headache both at home and abroad without seeing any real benefit. Without a change in policy, particularly policy pertaining to its military superiority and status, a victorious nation discovers that military superiority can lead to economic and political decline.

Of the many headaches the United States developed as the sole superpower in the years following the end of the Cold War, probably the most contentious was being asked to intervene in conflicts great and small. Seldom was there a clear right side and wrong side to support. In many cases the crises that prompted the debate over intervention occurred in regions that had previously been under the political, economic, and military supervision of the Soviet Union. Even when acting under the umbrella of the United Nations, U.S. intervention could stir conflicting emotions in the crisis region. The United States had been both the enemy and the possessor of enviable commodities for fifty years. Envy and distrust were not feelings easily eradicated simply because the war was over. In a world that seemed to be rupturing in the absence of Cold War superpower dominance, the United States struggled with its expanded role of policeman, banker, and social worker.

Military dominance, which had spurred the U.S. economy in the years following World War II, became a burden after the end of the Cold War. In the wake of international cooperation and the perception of peace, nations could shift away from military technology as a basis of economic growth. Nations that remained entrenched in military development became economically dependent on wars that seldom required Cold War technology. Furthermore, Cold War technology had been all about fighting a war from a distance, and the conflicts of the twenty-first century required boots on the ground. When President Truman and President Eisenhower put their support behind the development of nuclear technology and the means to deliver nuclear weapons from a distance, part of their justification was that it would reduce U.S. casualties and, hypothetically, shorten if not prevent war. Their reasoning was based predominantly on the notion that nations would fight nations, and that the days of tribal warfare were becoming part of the past. When the theories and perception of modern war shifted after the attacks on the United States in 2001, the world powers seemed taken by surprise. When the Second Gulf War did not produce the results predicted, when peace did not flourish, and when terrorism spread rather than diminished, the United States seemed not only surprised but confused. The U.S. war strategy and military development, so honed during the twentieth century, did not work in the twenty-first. A nation which had grown powerful through military superiority found itself the targeted enemy rather than the celebrated hero. Furthermore, it found itself struggling to justify increasing national debt, made larger by wars that seemed to have no end. Like many great powers that had come before, the United States faced decline despite continued military superiority. In fact, it could be argued, the United States faced decline because of its military superiority.

Endnotes

[1] G. John Ikenberry, After Victory: Institutions, Strategic Restraint, and the Rebuilding of Order after Major Wars (Princeton: Princeton University Press, 2001), xi.

Further Reading

Hixson, Walter L. The Myth of American Diplomacy: National Identity and U.S. Foreign Policy. New Haven, CT: Yale University Press, 2008.

Kennedy, Paul. The Rise and Fall of the Great Powers. New York: Vintage Books, 1989.

Obligated to Intervene

In 1820, the Congress of Troppau was convened. The great powers of the day determined that they held the right to intervene in the revolutionary conflicts of neighboring states. Maintaining the status quo and preventing the spread of nationalism and revolution was viewed as vital to quelling the type of conflict that had erupted in Europe during the French Revolution and the Napoleonic Era. While the beginning of the century had been fraught with what some called the first worldwide war, the remainder of the century saw only regional conflicts, most of which were harshly quelled before they could spread beyond their borders. However, the policy of intervention did not quell nationalism. During the twentieth century, nationalism would be at the heart of many conflicts, and the notion that great nations had the right to intervene to protect the status quo would be at the center of international policy for many nations, including the United States.

In the case of the United States, intervention became a tool to either protect or disrupt the status quo in a region, depending on which was most beneficial to U.S. interests. Intervention often placed the nation at odds with its own revolutionary history and patriotic rhetoric. Despite seeming hypocritical in nature, the United States was not forging new diplomatic patterns but rather following those established by the great powers of the past. The U.S. Founding Fathers may have wanted to distance themselves from the politics and practices of Europe, but their descendants embraced those policies as the United States rose to international supremacy during the twentieth century.

During its rise to superpower status, the United States benefited economically and politically. The right to intervene allowed the United States to protect economic markets, and in some cases add new markets and resources to its growing stockpile. While the nation doggedly denied that it was an empire, by the end of the twentieth century the problems associated with empires began to plague it. Most prominently, it could be argued, the United States faced the growing international expectation that it would intervene whenever conflict threatened a region’s status quo. After a century of gaining prominence and wealth through international intervention, often with the sole goal of protecting resources and markets, the United States found that the right to intervene had transformed into an obligation to intervene.

Historiography: How the Story is Written

In the modern world of minute-by-minute news coverage, it is easy to assume that history is being recorded both comprehensively and accurately. One may even think that the role of the historian is passé and that all the modern world needs is the analyst who will try to make sense of current events. Even in a world where the events of the day are documented, and where social media can turn our most mundane activities into a historical sketch that we can share with all of our cyber friends, the role of the historian is crucial. It may even be more crucial than ever before because of the sheer volume of data that must now be sifted through in order to create a comprehensive, yet pertinent, story.

Accuracy in the historical record has always been important to historians, but it has not been nearly as important as the story. In the days when history was borrowed from others in order to bolster a rising nation’s image, accuracy was often less important than fostering the image that a new empire was ancient and eternal in origin. A good example is the Roman Empire, which, having risen to power, desired a historical record that would magnify its greatness rather than highlight its youth. Throughout history, political entities as well as powerful individuals have sought to bolster their images by creating histories that connect them to other prestigious historical events, periods, and reigns. By likening themselves to others who were dynamic, successful, dominant, and strong, they create an image of grandeur that is only partially based on the events of their own time and of their own making.

As technology and the availability of the written record evolved over the centuries, it became harder to use the history of others as a means by which one’s own history could be created. Even before the printing press, some historians began comparing their own region’s historical journey with that of their neighbors. In some cases, as with the historian Tacitus, the neighbor was heralded for its purity and simplicity in contrast to the corruption at home. In other cases, the neighbor was villainized in an attempt to deflect attention away from unpopular home-state policy. In either situation, the history of others was borrowed, no longer as a means to explain where a political state had come from, but rather to explain how the home-state compared to others. This trend created an interesting phenomenon in the writing of history. No longer was it simply enough to extol the greatness of one’s own past; it became acceptable, and even expected, to criticize the neighbor as a means of exalting oneself. By making the neighbor seem less noble or even villainous, the historian could create an even more illustrious history of the home-state.

In the not so distant past, historians were at the whim and will of powerful men who could finance the scholar’s historical pursuits. Modern technology has changed this to some extent. Scholarly history may still be contingent on research grants made possible by powerful institutions and individuals, but technology has made everyone who uses the internet a historian, or at least a historical participant. No longer is it only the famous, powerful, or well-connected who get recorded. Some individuals may only be contributors of data, whereas others may add more significantly to the record of daily events. In this world of high-speed technology and vast data collection, history is being recorded more thoroughly than ever before, but that does not mean the record is any more accurate. Often, history is recorded in inaccurate ways by people with little to no understanding of the ramifications this has for both the people of today and the historians of tomorrow. In the race to beat the ‘other guys’ to the best story, accuracy, once again, is secondary to the story being told.

Modern historians, bound by ethical parameters of historical accuracy, try to share a story that is comprehensive and as unbiased as possible. They are taught to question sources and to present a full picture of an event, a person, or a period of time. In some cases, they are even taught to use good writing skills in order to make the story enjoyable to read. They are taught to recognize that history is not always pleasant, but that it can always be of value, if only to a few. At times, history can be a story of valor, bravery, and patriotic glory. At other times, it can be just the opposite. The modern historian may write a tale that makes some readers uncomfortable, but the job of the historian is to write a comprehensive and pertinent story rather than the myths and propaganda so many others are determined to write.