
Home Production and National Defense

One hundred years ago, malnutrition was a problem that worried a nation facing war. Industrialization and urban growth had moved large populations into congested cities and away from rural communities. Both World War I and World War II would see an increase in the urbanization of the United States. The progressive reformers of the early twentieth century recognized that urbanization was leading to an unhealthy population and pushed for reform. They also pushed for vocational education, particularly in the area of what would become known as Home Economics.

One of the great misconceptions of the modern age is that the skills of the preindustrial age were easily passed from generation to generation, and that it is only modern society that struggles with the problems associated with the loss of these skills. Unlike the dissemination of information, knowledge is gained through practice. Skilled crafts and vocations require practice and often a good deal of instruction by a skilled guide. Remove proper training, and the skills are not learned and society struggles. In particular, modern society struggles with malnutrition and, more recently, obesity, both of which can be directly linked to a lack of basic knowledge of nutritional food consumption. It could also be argued that the conveniences of modern food production add to these problems, especially where ‘prepared’ foods are concerned. Despite the flood of DIY programs and videos demonstrating cooking and gardening techniques, home production and preparation of food is not as common as a healthy society needs it to be.

New technology in the early 1900s brought advancements in home food production and storage, but the skills needed to safely process food had to be learned. During WWI, home canning and food storage were demonstrated and encouraged by divisions of local government and subsidized by the U.S. Department of Agriculture.[1] The Smith-Lever Act and the Smith-Hughes Act both provided funding for increased training in food production and domestic skills.

According to historian Ruth Schwartz Cowan, the “decade between the end of World War I and the beginning of the depression witnessed the most drastic changes in patterns of household work.”[2] Industrialization was changing the way work was managed, not just in the factories, but also in the homes. Industrialization increased the availability of commodities, many of which made household work less time consuming and arduous. Convenience is usually an appreciated commodity, especially for those tasked with managing a household while feeling the pressures of working outside the home. However, the skills that had been learned before convenient options became available were not always passed down to the next generation. Much like the youth of today, youth of past generations seldom liked learning to do things the old-fashioned way, especially not when new technology and innovation were changing the world.

In order to offset the trend and ensure a healthier society, young women in private and public schools were taught the skills that many today assume would have been handed down from mother to daughter. Books titled Clothing and Health, Shelter and Clothing, Foods and Household Management, and Household Arts for Home and School were produced and marketed to U.S. high schools. In the words of one author, “The authors feel that household arts in high schools should not be confined to problems in cooking and sewing. They are only a part of the study of home making.” In the 1915 edition of Shelter and Clothing, an entire chapter is dedicated to “the water supply and disposal of waste,” and includes diagrams of the modern flushable toilet. Technology had changed the lives of many, but progressive minds of the age could see that new technology had to be integrated into society through education rather than simply leaving society to work through the changes without assistance.

World War I, the Great Depression, and World War II jolted policy makers into action. By the mid-1950s, Home Economics, as a high school subject, was accepted as an integral part of keeping the nation healthy and ready for future war. Even as warfare became more mechanized, the nation still held on to a belief that a healthy society was a strong society, and many school systems encouraged both male and female participation in Home Economics during the early 1980s. Unfortunately, the Technological Revolution of the 1990s and 2000s shifted the mindset of many, and like the industrial revolutions of the past, this latest revolution has allowed convenience to supplant skill. While information is just a click away, the knowledge that comes from skilled instruction is often harder to obtain, placing the nation at risk once more.

Endnotes

[1] Emily Newell Blair and United States Council of National Defense, The Woman’s Committee: United States Council of National Defense, An Interpretative Report, April 21, 1917, to February 27, 1919, e-book (U.S. Government Printing Office, 1920).

[2] Ruth Schwartz Cowan, “The ‘Industrial Revolution’ in the Home: Household Technology and Social Change in the 20th Century,” Technology and Culture 17, no. 1 (1976): 1–23.

Change Came Quickly

In 1918, Fritz Haber was awarded the Nobel Prize in Chemistry. World War I delayed the presentation of the award because Haber was a German scientist, one who had gained the name ‘the father of chemical warfare’. Haber was a patriotic German committed to the German cause; however, less than fifteen years after he was celebrated as a great scientist, he fled his homeland fearing for his life. Fritz Haber was a Jew. He was also an intellectual who was too closely associated with a war that had been lost rather than won. Like many other German citizens, Haber discovered that under the right set of circumstances hate could replace friendship with great rapidity. Those circumstances included an economic recession, a turbulent political climate, an abundance of persuasive rhetoric, and a highly effective propaganda campaign. In less than two decades, a population that once celebrated Haber’s achievements turned its back on the evidence that its government had implemented a policy of incarceration and extermination. Race, religious affiliation, sexual orientation, and intellectual interests were more than enough justification for the public to look the other way, or worse, join the Nazi agenda. Change came quickly while the public clung to the notion that they were justified in their actions.

U.S. Compulsory Education: Teaching Exceptionalism

During the mid-nineteenth century, states began passing compulsory education laws, and although all states had these laws in place by the time the United States entered World War I, there was still quite a disparity between the levels of basic education received by the soldiers. Mobilization efforts during WWI highlighted the need for greater emphasis on education in the United States, but they also highlighted the need to emphasize a common nationality among the citizenry. The war had created a stigma on citizens and immigrants who were too closely related to or associated with the enemy. It was felt that the ‘old country’ culture, still held by many, needed to be replaced by a commitment to a less definable, but more patriotic, American culture. The desire to eliminate overt connections with European culture, a culture that seemed to instigate war rather than peace, led to strong measures designed to force change in the U.S. population. One measure included the effort to eliminate parochial schools, which were viewed as being too closely tied to European culture. When Oregon amended its compulsory education laws in 1922 with the intent of eliminating parochial schools, it faced opposition, including a Supreme Court case that ultimately ruled against the state. It was hoped that public education would transform the population into a more cohesive culture, and while states could not force public school attendance over private school attendance, over time many states were able to dictate curriculum requirements and achieve the underlying goals sought by legislators during the post-war period.

Many in the United States believed that the nation had a vital responsibility to encourage and spread notions of republican democracy. A growing belief in ‘American exceptionalism’ developed in the post-war years, due in part to wartime propaganda. If the United States was to be exceptional, then it needed to guarantee that its public understood what made it exceptional. Accomplishing this task meant that its citizens needed to understand history, and not just the history of the United States beginning with colonization or independence, but also the connection between the United States and the ancient history in which the foundations of democracy resided. Compulsory education, classes in American History and Western Civilization, and an emphasis on U.S. exceptionalism became the foundation for unifying a nation during the twentieth century.

Going to War: Power and Prosperity

The United States presents a fascinating study of the various reasons a nation chooses, or feels forced, to go to war. In the early days of the nation, war with foreign powers was seen as too entangling to enter into lightly. Attempts to circumvent armed retaliation for foreign oppression resulted in embargoes which hurt the U.S. more than they did those at whom the embargoes were aimed. Military retaliations seldom achieved the sought-after goals, although they did establish the clear message that the young nation would not tolerate foreign oppression. International conflict was costly regardless of the strategy, but by the late 1800s a new reality was emerging among the power brokers of the nation. War, while costly in men and machines, could also provide an economic boost to a nation struggling with recession. This reality would become even more pronounced in the 1900s as the machine began to dominate warfare and a race to beat others in the field of war technology intensified. War had become a profitable business even as the world became terrified of the horrible human destruction modern war created. By the mid-1900s, war technology began to threaten the very existence of mankind even as its development made many powerful and wealthy.

Interestingly, in the early decades of the 1900s men like Woodrow Wilson were well aware of how devastating war could be for humanity. Having been born in Virginia in the decade prior to the Civil War, Wilson’s earliest memories would have been of war, deprivation, and human suffering. He would have spent his youth seeing war veterans and hearing their stories. He went to school where he studied history and politics, subjects that would have exposed him to the many wars fought over power and possession. He earned a doctorate and became the first U.S. president to hold a PhD. The study of history and politics would have influenced his aversion to going to war, but his belief that the United States could influence others in a positive way would justify his support of intervention and, eventually, international war. Like many other intellectuals and politicians, his desire to spread the ideologies of democracy and capitalism, in other words, to help others become more like his beloved nation, blinded him to the fact that others might not wish to emulate the United States. As president, he was well on the way to being remembered for his military interventions and suppression of revolution before World War I thrust him into the role of international mediator.

Having come of age during the years when the United States attempted to heal from the wounds of war, and having seen firsthand the difficulties created in a society when harsh, punitive treatment was dealt to the defeated, it is not surprising that Wilson would wish to avoid repeating such mistakes when negotiating peace in Europe. It is also not surprising that Wilson would want to find a way to avoid future war. In the end, however, war is costly, and the desire to recoup one’s own expenses at the further detriment of the defeated is hard to suppress. Furthermore, notions of international cooperation can, for many, seem to weaken a nation rather than propel it to greatness. Peace is virtuous, but war promotes power and economic vitality, especially if the war never touches the homeland.

For Further Reading

Boemeke, Manfred F., Gerald D. Feldman, and Elisabeth Glaser, eds. The Treaty of Versailles: A Reassessment after 75 Years. Cambridge: Cambridge University Press, 1998.

Herring, George C. From Colony to Superpower: U.S. Foreign Relations Since 1776. New York: Oxford University Press, 2008.

Unexpected Consequences: Revolution

Prior to the twentieth century, war was most often the product of the elite rather than the common man. Assuredly, war had an impact, both direct and indirect, on the laborer. Whether from conscription, taxation, or proximity to the combat and the combatants, war could wreak havoc. War could also quickly change boundaries and force changes in allegiance. Entire regions could become disputed territory as powerful states weakened and weaker states grew strong. The chaos of the French Revolution and the Napoleonic Wars led the rulers of Europe to seek a balance of power that would prevent the outbreak of widespread war. For approximately a century they succeeded in quelling the rising nationalistic zeal that threatened to reignite the flames of world war. However, revolutionary ideologies were not contained even as rulers tried to contain revolt. While notions of self-determination, democracy, and equality were discussed by liberal-minded thinkers, the ruling class held fast to the notion that not all men were ready for or capable of self-rule. In some cases, outright racism was the justification for the continuation of imperial dominance and all the ills that imperialism wrought on subjugated peoples. In other cases, benign paternalism justified policies that increased inequality and protected the status quo. Regardless of the grand rhetoric of the time that promoted equality and brotherhood, paternalistic elitism, the belief that some were better suited to govern than others, remained the consensus of the day.

As the twentieth century dawned, changes in society due to industrialization were creating unrest. The outbreak of World War I ratcheted up the change. Women went to work in greater numbers, particularly women who belonged to the middle class. Men, who had once been viewed as expendable laborers, became a valuable commodity. Total warfare left no civilian untouched and confronted soldiers with the futility of war. As the fighting dragged on and deprivation increased, patriotic citizens on the battlefield and the home front struggled to find justification for the continued support of a war that seemed less and less justifiable.

In Russia, the seeds of revolution found fertile ground as the citizens lost faith in an old system that seemed to bring endless suffering. Elsewhere the notions of liberty, self-determination, and equality caused subjugated peoples to question why they should remain the possessions of rulers in distant lands rather than be allowed to govern themselves. While the Allied nations fought to prevent the invasion, subjugation, and annexation of small nations like Belgium and to prevent territorial losses in France, those same nations clung fast to their territorial holdings in other regions of the world. The brutality and futility of total war also caused many within Europe to question whether the empires that governed them did so with any consideration for their needs and their security. Ethnic unrest, nationalistic zeal, and distrust of those with different cultural habits increased as the war continued. The seeds of revolution were cast wide, some to find fertile ground immediately and others to remain dormant for decades, but all to produce the fruit of conflict and bloodshed. Revolution was not the goal of those who declared war in 1914, but revolution was its unexpected consequence.

Home Front: A Culture of War

Patriotism and honor spurred the rise of War Fever during the First World War. Men signed up to fight, and women proudly waved them good luck and goodbye. Yet as the war dragged on, War Fever waned. It was replaced by a culture of war which may not have eliminated the need for conscription, but did place a stigma upon those who did not support the nation’s efforts to defeat the enemy. Censorship and propaganda aided in the development of a culture focused on patriotic valor, sacrifice, and mourning.

Governments felt a strong need to censor bad news. It took President Wilson less than a month after the declaration of war to create the Committee on Public Information (CPI), which was tasked with the censorship of telegraphs and radio stations. The CPI was not only tasked with restricting negative news from reaching the public, but also with promoting a war that seemed to have little direct impact on the United States as a whole. The United States was not alone in its need to promote a war when a direct threat did not seem imminent. Great Britain had faced such a conundrum as well. Politicians and military leaders may have been convinced of the dangers the enemy posed to national interest and defense, but the general public had to be convinced. The United States faced an even greater challenge in convincing the public than the British had faced. Proximity to the enemy was not a plausible justification for the United States to go to war. Additionally, evidence of sabotage and espionage did not generate the kind of call to war that invasion and occupation did for nations like France. Propaganda needed to inspire the nation to develop a culture of war based not on immediate threat, but on the defense of ideology – the defense of liberty and democratic principles.

Fighting a war based upon the belief that the enemy did not seek territory but sought to destroy the virtuous fiber of the nation was accomplished by demonizing the enemy. In every war demons would emerge, but it was not enough to demonize an individual or a military unit; the enemy’s society as a whole needed to be shown as demons seeking to exterminate all that the nation held dear. Over time, propaganda would help the public develop a hatred for an enemy who seemed to loathe the virtues of liberty, freedom, and human dignity. In the name of these virtues, the home front embraced a culture of war even as it mourned the costs of war.

Off the Battlefield and Socks

An old photo depicting men and women knitting socks flashes before my mind’s eye. Young and old, men and women, the wounded. Knitting socks was a way to support the troops of World War I. Today a trip to Walmart can easily supply a package of cotton socks. Wool socks, sturdy and durable, might take a bit more searching to find, but a visit to a good sporting goods store, especially one selling skiing supplies, will do the trick. The days when proper foot care required handmade socks are long gone, and with the passage of time the memory of the dedicated service provided by the sock makers has faded. It is estimated that sixty-five million men were mobilized to fight in WWI, and each soldier would have needed socks as he went to war, and then more socks to replace the ones worn out by long marches or damp trenches. On the home front, knitting campaigns called people to action. Idle hands at home meant soldiers on the battlefield would suffer.

The technological advancements of the early 1900s did not eliminate the need for handmade socks, and as the world entered a second war, the patriotic call again went out for more socks. However, technology had made war much more destructive. The bombing campaigns of WWII left towns in rubble and displaced an estimated sixty million Europeans. When the war ended, the hardships of war did not. Basic essentials for survival were still desperately needed. The infrastructure destroyed by military campaigns had to be rebuilt before the suffering could end. Battlefields had to be cleared and communities reestablished. Unfortunately, the humanitarian efforts of busy hands and caring hearts ran into political roadblocks. Decimated nations could not process and deliver the goods effectively. A care package from a distant relative or a long-distance friend had an easier time getting through to a family in need than did large-scale aid from relief organizations.

By the end of the twentieth century, handmade socks were a novelty rather than a necessity, and nations had learned valuable lessons about both the effects of war on and off the battlefield, and the need for post-war recovery efforts to eliminate humanitarian crises once war had ceased. As the century ended, the severity of war had not necessarily diminished, but the percentage of the population directly affected by war had. War still displaced, disrupted, and decimated local populations, but seldom reached the distant homelands of the foreign nations providing military support for weak governments. Therefore, the patriotic call to serve those who sacrificed and suffered in the name of liberty, freedom, or national interest was easily drowned out by the pleasurable distractions of life in a homeland untouched by war. By the end of the twentieth century, war, much like handmade socks, was a novelty rather than a reality – something other people might do, but not something that had a place in the modern, fast-paced, safer world many were sure the new century would bring.

Humanity on the Battlefield

There is a popular story that goes around at Christmas time about soldiers all along the Western Front calling a truce and singing “Silent Night” on Christmas Eve. What is often left out of the story is the anger this show of humanity caused in the higher leadership. During war, a reminder that the enemy is not the monster which propaganda depicts can interfere with morale, and with a soldier’s determination to win at all costs. Yet on that Christmas Eve, men on opposing sides of a futile war remembered that only politics separated them. Christmas marked the fifth month of war and the third month in the trenches. World War I was still in its early days, and there was still hope for victory and for the short war the generals and politicians on both sides had promised the soldiers. The peace hoped for on Christmas Eve 1914 would not be found until Christmas time 1918. The brutality of the war and the anger of generals would squelch attempts to repeat what had sprung up so naturally along the Western Front in 1914. However, the legend of the first Christmas of WWI would remind generations that even in war, humanity can survive.

National Security: the Value of Nutrition and Education

In the years leading up to World War I, many progressive thinkers began to campaign for social reform. The industrial revolution had changed society in many ways, not all of which were good for the nation or for national security. Unskilled and skilled labor alike were susceptible to the ills of urban life. Just as the war in Europe was igniting, one group of progressive reformers was introducing home economics textbooks and coursework into schools. Proper hygiene and good nutrition began to be taught alongside other subjects. Malnutrition and disease were viewed as ills which not only weakened society but undermined national well-being. The reformers who pushed for better living conditions and education for urban families gained a powerful ally when the United States entered WWI. That ally was the U.S. Army. When faced with a modern war, modern in both weaponry and technology, the U.S. Army quickly discovered that it was no longer beneficial to go to war with illiterate soldiers. Modern war demanded healthy soldiers who could communicate efficiently with one another. Basic health and literacy became a necessity for the modern army. The ground gained in understanding this truth was not easily won. The soldiers who fought in the war learned firsthand the value of both a healthy body and the ability to communicate with their fellow soldiers. A common language, coupled with the ability to read and write in it, would be something the returning soldiers would seek for their own children. These veterans would push for change. By the end of World War II, the realities of modern war mandated a nation populated with citizens possessing basic health and education. Education and proper nutrition became a matter of national security.

Additional Reading:

  • Keene, Jennifer D. Doughboys, the Great War and the Remaking of America. Baltimore: The Johns Hopkins University Press, 2001.
  • National Security Act of 1947, Title X.
  • There were various publications designed to introduce Home Economics in the schools. Some have been scanned and can be found in different e-book collections. Original copies can be found through used bookstores. My favorites were authored by Helen Kinne and Anna M. Cooley.

No Man’s Land

A term older than World War I but popularized during that war, no man’s land refers to a stretch of land under dispute by warring parties, but it can also refer to lawless areas with little or no governing control. A buffer zone, on the other hand, is an area which provides a sense of protection from the enemy. When physical fortifications offer little protection, buffer zones can provide a perception of security. Nations great and small seek the perception of security when security is elusive. Treaties and alliances are traditional means of creating a sense of security, as is the creation of buffer zones. During the Cold War, the competing nations sought to expand their spheres of influence, thereby creating buffer zones between themselves and their enemies as their spheres grew. When the Cold War ended and the buffer zones were no longer needed, many of the buffer nations found themselves with fewer friends and with fewer resources to prevent lawlessness. These nations found it difficult to avoid the development of no man’s land within their borders.

The United States reasoned, even in its earliest days, that oceans made excellent buffer zones against the conflicts of Europe. Unsettled territories were adequate as buffers, but only to a point. While unsettled territories didn’t pose a direct European threat, they were still loosely under the influence of powerful countries. Additionally, they often attracted outlaws fleeing justice and smugglers seeking a base of operation near their markets. In 1818, Andrew Jackson decided to pursue a group of raiders into Florida. The problem was that Florida was owned by Spain, and Spain had little ability to prevent lawlessness in the territory. When Jackson’s army crossed into Florida, he invaded a foreign nation. Without the consent of Spain, such an action created an international incident. Fortunately, Secretary of State John Quincy Adams was able to capitalize on Jackson’s actions and convince Spain that a treaty was better than a war. His reasoning for defending Jackson’s violation of Spanish sovereignty was that “it is better to err on the side of vigor.”[1] It was certainly not the first time a nation chose a declaration of strength as its response to an international crisis of its own making, but it was possibly the first time such a response became national policy. As Secretary of State, Adams greatly influenced the foreign policy decisions of the president and authored much of what President Monroe presented to Congress. In March 1818, President Monroe declared to Congress that when a nation no longer governed in such a way as to prevent its lawlessness from spilling onto its neighbors, then the neighbors had the right to protect themselves and to seek justice even if it meant violating the sovereignty of another nation.[2] In other words, when an area became no man’s land, it was to the benefit of all nations for the lawlessness to be eliminated by whoever had the strength and will to do so.

Eliminating no man’s land in North America was a task that occupied the United States for more than a century. Eventually, the United States would reach from ocean to ocean and would gain the military might of a great nation. However, even as the twentieth century dawned, the United States struggled to bring law to all of its territory. During the century of expansion, some in the United States saw potential in the acquisition of territory in the south, particularly in Central America. Others recognized the difficulty of governing such a vast nation. Faced with lawlessness due to revolt in Mexico during World War I, Wilson authorized the U.S. Army’s invasion of Mexico. However, Wilson recognized the value of having a buffer zone south of the border and eventually withdrew the army. In order to ensure that the southern nations created a friendly buffer zone, the United States supported governments that kept the peace, even though keeping the peace came at the expense of basic human rights. Like many leaders before and since, President Wilson put aside ideology and accepted peace-by-force as being better than lawlessness.

Reflecting on history, some leaders have sought security by building huge empires, some by establishing buffer zones, and others by the targeted elimination of no man’s land. Regardless of the method men and nations have chosen, it is clear that international law, notions of liberty and self-determination, and hope for world peace are always secondary to the goal of eliminating the threat posed by no man’s land.

Endnotes:

[1] Samuel Flagg Bemis, John Quincy Adams and the Foundations of American Foreign Policy (New York: Alfred A. Knopf, 1956), 315-316.

[2] James Monroe, “Spain and the Seminole Indians,” American Memory, Library of Congress (March 25, 1818), http://memory.loc.gov/cgi-bin/ampage?collId=llsp&fileName=004/llsp004.db&Page=183.