Tag Archives: Industrialization

Not Naive

In January 1789, on the eve of his election as the first president, George Washington wrote the following words to his dear friend, the Marquis de Lafayette.

While you are quarreling among yourselves in Europe – while one King is running mad – and others acting as if they were already so, by cutting the throats of the subjects of their neighbours; I think you need not doubt, My Dear Marquis we shall continue in tranquility here – And that population will be progressive so long as there shall continue to be so many easy means for obtaining a subsistence, and so ample a field for the exertion of talents and industry.

Washington, like so many of his countrymen, saw the American abundance of land and resources as a safeguard against the chaos plaguing Europe, specifically the chaos that derives from overcrowding and the ills such crowding inspires. He wrote, “I see a path, as clear and as direct as a ray of light… Nothing but harmony, honesty, industry, and frugality are necessary to make us a great and happy people.”[1]

Men like Washington felt strongly that certain key moral principles would flourish in a land as abundantly blessed as America. As a leader of men for most of his adult life, he would not have been blind to the tendencies of human nature, but clearly he believed that those men dedicated to “industry and frugality” would prevail over those who sought slothful pursuits. The United States was predominantly agrarian during those early years. Commerce, especially the trade of raw materials for finished goods, may have dominated the coastal areas of the new nation, but industrialization had not yet lured workers from the fields and into the cities. Subsistence farming was still the predominant occupation, and it was an occupation that did not tolerate sloth. Washington was able to envision generations of “tranquility” rather than the chaos bred by congested cities and limited resources. He was not naive to the realities of human nature; he simply could not foresee how quickly the world would change once industrialization took hold.

[1] George Washington, George Washington: Writings, vol. 91, Library of America (New York: Library of America, 1997), 717–718.

National Security: the Value of Nutrition and Education

In the years leading up to World War I, many progressive thinkers began to campaign for social reform. The industrial revolution changed society in many ways, not all of which were good for the nation or for national security. Unskilled and skilled labor alike were susceptible to the ills of urban life. Just as the war in Europe was igniting, one group of progressive reformers was introducing home economics textbooks and coursework into the schools. Proper hygiene and good nutrition began to be taught alongside other subjects. Malnutrition and disease were viewed as ills which not only weakened society but undermined national well-being.

The reformers who pushed for better living conditions and education for urban families gained a powerful ally when the United States entered WWI: the U.S. Army. Faced with a war that was modern in both weaponry and technology, the U.S. Army quickly discovered that it was no longer practical to go to war with illiterate soldiers. Modern war demanded healthy soldiers who could communicate efficiently with one another. Basic health and literacy became necessities for the modern army.

The ground gained in understanding this truth was not easily won. The soldiers who fought in the war learned firsthand the value of both a healthy body and the ability to communicate with their fellow soldiers. A common language, coupled with the ability to read and write in it, was something the returning soldiers would seek for their own children. These veterans would push for change. By the end of World War II, the realities of modern war mandated a nation populated with citizens possessing basic health and education. Education and proper nutrition had become matters of national security.

Additional Reading:

  • Keene, Jennifer D. Doughboys, the Great War, and the Remaking of America. Baltimore: The Johns Hopkins University Press, 2001.
  • National Security Act of 1947, Title X.
  • There were various publications designed to introduce Home Economics in the schools. Some have been scanned and can be found in different e-book collections. Original copies can be found through used bookstores. My favorites were authored by Helen Kinne and Anna M. Cooley.

No Man’s Land

A term older than World War I but popularized during that war, no man’s land refers to a stretch of land under dispute by warring parties, but it can also refer to lawless areas with little or no governing control. A buffer zone, on the other hand, is an area which provides a sense of protection from the enemy. When physical fortifications offer little protection, buffer zones can provide a perception of security. Nations great and small seek the perception of security when security is elusive. Treaties and alliances are traditional means of creating a sense of security, as is the creation of buffer zones. During the Cold War, the competing nations sought to expand their spheres of influence, thereby creating buffer zones between themselves and their enemies as their spheres grew. When the Cold War ended and the buffer zones were no longer needed, many of the buffer nations found themselves with fewer friends and with fewer resources to prevent lawlessness. These nations found it difficult to avoid the development of no man’s land within their borders.

The United States reasoned, even in its earliest days, that oceans made excellent buffer zones against the conflicts of Europe. Unsettled territories were adequate as buffers, but only to a point. While unsettled territories didn’t pose a direct European threat, they were still loosely under the influence of powerful countries. Additionally, they often attracted outlaws fleeing justice and smugglers seeking a base of operation near their markets. In 1818, Andrew Jackson decided to pursue a group of raiders into Florida. The problem was that Florida belonged to Spain, and Spain had little ability to prevent lawlessness in the territory. When Jackson’s army crossed into Florida without Spain’s consent, he invaded a foreign nation and created an international incident. Fortunately, Secretary of State John Q. Adams was able to capitalize on Jackson’s actions and convinced Spain that a treaty was better than a war. His reasoning for defending Jackson’s violation of Spanish sovereignty was that “it is better to err on the side of vigor.”[1] It was certainly not the first time a nation chose a declaration of strength as its response to an international crisis of its own making, but it was possibly the first time such a response became national policy. As Secretary of State, Adams greatly influenced the foreign policy decisions of the president and authored much of what President Monroe presented to Congress. In March 1818, President Monroe declared to Congress that when a nation no longer governed in such a way as to prevent its lawlessness from spilling onto its neighbors, then the neighbors had the right to protect themselves and to seek justice, even if it meant violating the sovereignty of another nation.[2] In other words, when an area became no man’s land, it was to the benefit of all nations for the lawlessness to be eliminated by whoever had the strength and will to do so.

Eliminating no man’s land in North America was a task that occupied the United States for more than a century. Eventually, the United States would reach from ocean to ocean and would gain the military might of a great nation. However, even as the twentieth century dawned, the United States struggled to bring law to all of its territory. During the century of expansion, some in the United States saw potential in the acquisition of territory to the south, particularly in Central America. Others recognized the difficulty of governing such a vast nation. Faced with lawlessness spilling over the border during the Mexican Revolution in the midst of World War I, President Wilson authorized the U.S. Army’s incursion into Mexico. However, Wilson recognized the value of having a buffer zone south of the border and eventually withdrew the army. In order to ensure that the southern nations formed a friendly buffer zone, the United States supported governments that kept the peace, even though keeping the peace came at the expense of basic human rights. Like many leaders before and since, President Wilson put aside ideology and accepted peace-by-force as better than lawlessness.

Reflecting on history, some leaders have sought security by building huge empires, some by establishing buffer zones, and others by the targeted elimination of no man’s land. Regardless of the method men and nations have chosen, it is clear that international law, notions of liberty and self-determination, and hope for world peace are always secondary to the goal of eliminating the threat posed by no man’s land.


[1] Samuel Flagg Bemis, John Quincy Adams and the Foundations of American Foreign Policy (New York: Alfred A. Knopf, 1956), 315-316.

[2] James Monroe, “Spain and the Seminole Indians,” American Memory, Library of Congress, (March 25, 1818),  http://memory.loc.gov/cgi-bin/ampage?collId=llsp&fileName=004/llsp004.db&Page=183.

Man or Machine: War in the 20th Century

World War I changed many facets of warfare, particularly where technology was concerned. The automobile, chemical weapons, and tanks stand out, but so do airplanes, submarines, and machine guns. Some of these weapons were developed during the war, and some were simply advanced beyond their pre-war status. The prevalence of the machine in World War I marked a dramatic shift in how the wars of the twentieth century would differ from the wars of previous centuries. These new machines created mass casualties beyond anything Europe had ever experienced.

By the end of World War II and the beginning of the Cold War, nations would wonder whether machines could replace the soldier, or at least reduce the cost in human life. President Truman considered nuclear technology a means to reduce Allied casualties. After two devastating world wars, the notion of technology carrying the brunt of the work was appealing. The second half of the twentieth century provided opportunities for the theory to be tested. Air power became a key component in strategic planning. Bombardments from the air, whether from aircraft or from missiles located many miles from the conflict zone, devastated communities. All indications seemed to point to a day when machines would replace boots on the ground. However, machines did not rout the enemy, regardless of the devastation they created. As the twenty-first century dawned, warfare seemed to depart from the oceans and grand battlefields where the machine dominated, and instead entered the villages and city streets where man could maneuver more adeptly. Despite all the technological development of the machines of war, man, with his natural adaptability, time and time again remained supreme.

War Fever

The dawn of the twentieth century had all the conditions necessary for an outbreak of war fever. The great nations and empires of the previous centuries were struggling to hold onto their power and status. New nations were expanding and seeking empire status in the wake of shifting colonial control and changes in markets. Revolution and nationalistic zeal challenged and destabilized the status quo. War was an opportunity to demonstrate national strength and military superiority. It was viewed as both a means to hold onto power and a means to gain power. War fever spread simply because those with power did not wish to see it dwindle and those without it wanted to gain what they believed was being denied to them. Modern technology made war destructive beyond measure, but technology also helped spread the propaganda necessary to inflame populations and sustain war fever.

During the mid-1800s, under the leadership of Otto von Bismarck, Germany became a unified nation and strove to become a great industrialized power. Yet while Bismarck sought unification and political strength, Kaiser Wilhelm II longed for military glory and a strong German state which could withstand a simultaneous attack by its neighbors. Leading Germany on a path of militarization, the Kaiser lit the sparks that would flare into war fever in 1914. In a recent article, Professor Holger Afflerbach wrote that war was both dreadful and “glorious,” with soldiers being granted “high social prestige,” especially in nations where a militarization movement had taken root.[1] Germany was not the only nation to experience war fever or to glorify the honor of soldiering. Prior to World War I, war had been limited in its scope, at least for the most part. Armies battled armies, and the civilian population as a whole was relatively unaffected by combat. Total war was a concept few had experienced directly. Technology and ease of transportation had begun to change warfare during the 1800s, but it would take World War I to bring these changes to public light. So in August 1914, when the call to war was made, men signed up for what they hoped would be a quick, glorious war. The armies of Europe swelled as patriotic fervor fanned the flames of war fever.

Strangely enough, despite entrenched combat and the undeniable horrors of modern warfare, war fever spread to the United States in 1917. In an age when the media were sympathetic to the nation and national causes, little of the true nature of war made it into the far-off homes of U.S. citizens. An ocean away, the horror of war was overshadowed by patriotic notions of rallying around the flag and racing to the aid of allies. In a great crusade to defend democratic liberty, the United States promoted war fever in order to fill its military ranks. In doing so, it demonstrated the value of industrial might in world affairs and propelled itself to great power status in a world where the traditional balance of power was shifting. Germany had disrupted the balance of power established under the Concert of Europe when it unified and became an industrial force. Fearing its closest neighbors, Germany industrialized and militarized, making itself a rival and a threat to European stability and the status quo. The United States, on the other hand, feeling less threatened by its neighbors, had dedicated its energies to industrialization. Germany recognized the danger the United States posed as a major industrial nation, but calculated that the weak U.S. military structure would hinder a U.S. response to European war. Its calculations were wrong. The United States surprised the world with a rapid response. Due in part to a propaganda campaign which not only ignited war fever but used modern technology to spread it quickly and widely throughout the U.S. population, the United States went from anti-war to pro-war almost overnight, although the actual preparations involved in readying an army to fight took a bit longer to manage.

War fever was a contagion that benefited from the notion that a limited war produced little disruption to the home front and would grant the nation, and its warriors, prestigious accolades. While World War I would demonstrate the brutality of modern war and introduce to Europe the horror of total war in a modern age, the lessons would not be universally comprehended. The deprivations of war would be better understood by the end of World War II, when aerial bombardment turned the home front into the front line. The realities of modern war should have eradicated war fever entirely; however, the threat of war fever returned as total war became part of history and the notion of limited war reemerged as a prominent strategy during the Cold War. Much like a viral contagion, war fever could rage for a period and then die down until once again conditions were right for its return.

[1] Holger Afflerbach, “The Soldiers Across Europe Who Were Excited About World War I,” The Conversation, August 4, 2014, (accessed October 24, 2014), http://theconversation.com/the-soldiers-across-europe-who-were-excited-about-world-war-i-29807.

Unexpected Consequences: Embargoes in the Early 1800s

Thomas Jefferson’s Embargo Act of 1807 did not work out the way he had planned. The restricted flow of British goods entering the United States spurred the development of U.S. manufacturing and changed society. Even the Embargo Act’s replacement, the Non-Intercourse Act, which allowed for trade with nations other than Great Britain or France, did not halt the changes occurring within the United States. Two changes in domestic life in particular resulted from the embargoes and the war: increased investment in cotton textile manufacturing, and the development of iron production, particularly the production of cast iron. Eventually these developments in U.S. manufacturing would provide cheaper textiles for the home and promote changes in the preparation of food through the proliferation of the cooking stove.

In the early 1790s, Samuel Slater helped develop the first modern cotton mill in the United States, but it would be two decades later, during the time of embargoes and war, that Francis Cabot Lowell’s textile mill system was built, incorporating into one location the production of cotton thread and the finished woven cotton fabric. These changes reduced the cost of production and increased productivity, thereby making cotton fabric more available to the average household. Prior to these changes in cotton textile manufacturing, cotton was considered a luxury fabric. Whereas flax could be grown easily and turned into linen by the skilled spinners and weavers in the United States, cotton fabric was typically imported prior to the development of U.S. manufacturing in the early 1800s.

In addition to the availability of less expensive textiles, changes in cooking methods were occurring due to the new technology of the cooking stove. While the first modern cooking stoves began appearing in the mid-1700s, it wasn’t until the end of the century that the major flaws had been worked out. Yet the cooking stove remained outside the reach of the average home due to the high cost of cast iron in the United States. The embargoes and the War of 1812 highlighted the need for increased domestic production of iron, and by the 1820s iron production had spread through Pennsylvania, with Pittsburgh becoming known as the “smoky city.”[1] The greater availability and affordability of cast iron stoves changed the way food was prepared. Not only were the stoves safer than open fires, they allowed the cook a greater range in what she could prepare. No longer limited to a stewpot or spit, a woman could prepare a larger variety of food for her family without the need of additional labor in the home.

Thomas Jefferson had opposed men like Alexander Hamilton who promoted the development of U.S. manufacturing. Neither Jefferson nor Hamilton was blind to the ills of industrialization, but while Jefferson preferred to believe that the agrarian lifestyle was superior to manufacturing and would better promote liberty, men like Hamilton understood that economic power would be required to protect that very same liberty. The production of raw materials alone would not be enough to propel the United States to greatness, and without economic greatness, liberty would always be threatened by the dangers of imperialism and war. So while Jefferson’s embargo was meant to pressure the European nations into respecting U.S. sovereignty, it acted instead as an affirmation that U.S. manufacturing was vital to U.S. survival. The embargoes also helped change domestic life in the United States. Textiles for bedding and clothing became more available and less expensive, and the entire method of cooking was transformed. The U.S. response to international conflict in the early 1800s resulted in the unexpected consequence of increased investment in manufacturing, which in turn changed society by transforming life in the home.

[1] Anne Madarasz, “Tracing the Smoky City,” Western Pennsylvania History, 2002, (accessed October 18, 2014),  https://journals.psu.edu/wph/article/viewFile/5111/4894.