

PRINTED FROM OXFORD REFERENCE (© Oxford University Press, 2013. All Rights Reserved. Under the terms of the licence agreement, an individual user may print out a PDF of a single entry from a reference work in OR for personal use; for details see Privacy Policy and Legal Notice).


Food Processing

The Oxford Encyclopedia of the History of American Science, Medicine, and Technology
Gabriella M. Petrick


Just as other sectors of the U.S. economy became industrial in the second half of the nineteenth century, so too did food. Meatpacking and canning are perhaps the most common examples, but grain production and milling, baking (bread, cookies, cakes, and pies), and breakfast cereals also shifted toward industrial production by the turn of the twentieth century. By the 1920s, fruits and vegetables would also begin to fit neatly into an industrial system. Yet, it is wrong to say that the American diet was industrial before late in the twentieth century. Although the industrial revolution reshaped American life, it was an evolution in food processing that transformed what Americans ate (Panschar, 1956, p. 46). As the twin forces of industrialization and urbanization became more powerful in the early twentieth century, fewer Americans had the means (or perhaps the desire) to grow and process their own foods. The movement of rural Americans into higher-paying jobs in cities, often with minimal cooking facilities, led both native-born Americans and new immigrants like Anzia Yezierska and her family to rely on delicatessens, cafeterias, saloons, tearooms, and street vendors for their daily meals (Yezierska, 1925, p. 27; Turner, 2009, pp. 217–232).


Because wheat was so important to the European diet, it was commonly referred to as “the King of Cereals.” Its standing in the American context was no different. Early milling was dominated by the gristmill, featuring heavy millstones that crushed grain into flour. One of the first American innovators in milling was Oliver Evans, with his gravity-driven automated mill. His mill, perfected by 1787, relied on conveyors to move grain and flour as it was ground, sieved, dried, and stored from the top to the bottom of his building. His elevator (an endless-bucket system inside a closed chute) and hopper-boy (two radial arms set with teeth that slowly raked the flour to cool it after grinding) were widely adopted by millers throughout the country. Although Evans’s invention relieved much of the hard work and labor required in milling and incrementally improved quality (Pursell, 1995, pp. 27–28), it was the importation of Hungarian roller-milling technology in the 1870s and the adoption of “high milling” with the middlings purifier, which separated the bran, germ, and endosperm and produced very-high-quality patent flour from hard wheat, that transformed Minneapolis’s more traditional mills into the truly industrial factories that made “Mill City” into the center of American flour production (Vulté and Vanderbilt, 1916, pp. 61–63). By the 1880s, rolling and sifting technologies allowed steam-softened wheat to be milled into consistent blends of flour that gave millers, and ultimately bakers, control over their products. In combining protein-rich hard wheat with starch-rich soft wheat, millers could produce specialized flours for all sections of the baking industry as well as for home use.
The three major classes of flour were bread (high protein/low starch), cake (high starch/low protein), and all-purpose (an intermediate mix of protein and starch), the last used for products that need some gluten development but also require a light crumb, like biscuits, cookies, crackers, and pie dough (Panschar, 1956).
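The blending described above is, arithmetically, a weighted average, and the classic "Pearson's square" method gives the proportions directly. A minimal sketch, using hypothetical protein percentages (the function name and the figures are illustrative, not drawn from the source):

```python
def blend_parts(hard_protein: float, soft_protein: float, target: float):
    """Return the fractions (hard_share, soft_share), summing to 1.0,
    of hard- and soft-wheat flour whose blend hits `target` protein.
    A Pearson's-square calculation with hypothetical percentages."""
    if not (soft_protein < target < hard_protein):
        raise ValueError("target must lie between the two protein levels")
    # Weighted-average algebra: hard_share * hard + (1 - hard_share) * soft = target
    hard_share = (target - soft_protein) / (hard_protein - soft_protein)
    return hard_share, 1.0 - hard_share

# Illustrative figures: hard wheat at 14% protein, soft wheat at 8%,
# aiming for an all-purpose flour at 10.5% protein.
hard, soft = blend_parts(14.0, 8.0, 10.5)
print(f"hard: {hard:.3f}, soft: {soft:.3f}")
```

Under these assumed numbers the blend is roughly 42 percent hard wheat to 58 percent soft; the same arithmetic, scaled up, is what let millers tune flour for each section of the baking trade.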

An additional advantage of flour produced in the industrial flour mills of Minneapolis and other northwestern cities was that it was relatively shelf stable. Unlike flour produced using traditional millstone technologies, which ground the oily germ into the flour, rolling-mill flour did not become rancid quickly (Vulté and Vanderbilt, 1916, p. 63). Flour’s new stability allowed it to be shipped farther via expanding railroads and stored longer in urban warehouses, transforming a once perishable product into a commodity (Cronon, 1991). In scaling up and further refining flour production, companies like Washburn-Crosby, General Mills, and Pillsbury made more flour available more cheaply to all Americans than ever before (Chandler, 1977). This flour was also more predictable in its protein and starch content, allowing bakers and housewives to make more consistent products. The flavor, color, and texture of industrial flour also met Americans’ preference for fine white flour (Cummings, 1940; Levenstein, 1988, p. 22). The availability of large volumes of flour at any time of year encouraged growth in the baking industry.

With the development of turbomilling in the late 1950s, millers no longer needed to blend hard and soft wheat to create patent flours because the technology enabled millers to separate protein molecules from starch molecules. Using turbomilling, which combined air-separation, flow-dynamics, and centripetal-force technologies, millers produced an even wider array of protein/starch blends in flour (Larson, 1959, pp. 194–197). Industrial bakers could at last control the protein and starch content of their dough with exacting precision, giving them more consistency in their product as it flowed through the factory.


Although corn has a much longer history in America than wheat, its significance to food processing lies in meat rather than grain production. In fact, cornmeal is more closely linked to deprivation and malnutrition than any other cereal produced in the United States, despite its importance to the southern diet both during and after slavery. Even before the Civil War, corn fattened both hogs and cattle across the nation. With the closing of the West and the rise of Chicago as the center of railroading and meat processing, corn moved from the hinterlands into midwestern cities to finish pork and, increasingly, beef.

The shift from Cincinnati to Chicago as the heart of American meat production signaled the ascendance not only of railroads but also of beef (Horowitz, 2006, p. 29). With the opening of the Ohio River Valley in the early 1800s, Cincinnati became a center for pork slaughtering because of its location. River and canal systems brought country pigs to the city to be converted into meat or, more precisely, barrel pork. Industrial pork was not fresh, but rather cured and salted so that it could be shipped vast distances. Production was also seasonal: farmers took advantage of the pigs’ summer foraging and the additional weight gained from corn to slaughter in the fall and winter. The same rivers that brought pigs into the city facilitated pork’s dissemination to eastern cities and western farmers. By midcentury, it was uncommon for middling Americans not to have meat, in the form of cured pork, on a daily basis. Even the poorest American could afford pork regularly (Horowitz, 2006, pp. 12–13). The key to producing barrel pork was the mechanization of the slaughterhouse. Pigs were herded to the top of a four-story building, where they began their journey from animal to food. Once at the top, animals were struck in the head with a mallet and then had their throats cut and were hung from their hind feet to be bled out. The pig flowed from one operation to another until it was cut into pieces and layered with salt, saltpeter (potassium nitrate), and sugar, and the barrel was then filled with water. The barrels of pickled pork could be stored and shipped great distances to feed Americans from shore to shore.

As railroads began to traverse the Mississippi Valley, Chicago became the center of meatpacking because it connected farmers to cities (Cronon, 1991, pp. 73–74). Not only did the vast plains of the trans-Mississippi West produce ever more wheat and corn, but also its grasslands were ideal for grazing cattle. The destruction of the Plains Indians and the buffalo allowed ranchers to raise cattle in ever-larger numbers (Cronon, 1991, p. 213). New railheads allowed live cattle to move quickly from field to slaughter in urban areas. As the rail hub linking the West to the East, Chicago became a logical place for slaughtering. Although Chicago meatpackers did not invent the disassembly line, they certainly perfected it, largely by adding refrigeration to both the slaughterhouse and the packing shed. Keeping meat cold rather than preserved transformed Americans from pickled pork eaters into fresh beef eaters (Horowitz, 2006, p. 44). Gustavus Swift’s refrigerated railcar solved the shipping problem. Industrial refrigeration in the plant, on the rails, in the branch house, and at the butcher’s provided fresh meat year-round. Mass beef was also cheap beef, and thus pork became the “other red meat” and beef became synonymous with American prosperity in the first decade of the twentieth century. By colluding with railroads and forcing many local slaughterhouses and distributors out of business, the Meat Trust controlled 90 percent of the chilled beef in the country at the turn of the twentieth century. It was not until the 1970s, when the packinghouse moved closer to feedlots and vacuum packing enabled processors to portion meat, that butchers were completely driven out of meat processing. Rather than shipping a side of beef to towns and cities, now beef simply came in a box and unskilled employees could wrap, weigh, and price it for consumers to pick up in the meat department (Horowitz, 2006, p. 144).

When Herbert Hoover promised Americans a chicken in every pot, he was promising them an expensive and special food. Chicken did not become an everyday item until after World War II. Although chickens were ubiquitous on farms throughout the nineteenth century, they were largely eaten only after their laying days were over. Eggs rather than meat were a chicken’s contribution to the American diet into the twentieth century. As eastern cities grew along the Atlantic coast, specialized chicken farmers brought their live broilers to markets. These birds varied dramatically in quality, and consumers rarely knew whether there would be enough meat on the chicken to feed a family. Throughout the 1920s and 1930s, scientists at land-grant universities sought to develop hardier broilers that could be mass produced. During the early 1920s, poultry researchers solved one of the major limitations of industrial broiler production: leg weakness. By simply adding cod liver oil to poultry feed, agricultural scientists increased the vitamin D in the chicken’s diet and prevented leg weakness (Boyd, 2001, p. 638). With the vitamin deficiency solved, broilers could now be raised indoors and throughout the year rather than just in the warmer months. Raising chickens in confinement gave farmers the controls they needed to intensify production. With virtually all farms having electrification by the 1950s, broiler production became truly industrial as heated and air-conditioned henhouses spread across the South and Midwest.

Expanding production quickly translated into cheaper chicken. By the 1960s, a plump chicken in every pot was no longer a political slogan, but a reality. In addition to expanding the quantity of birds available for market, the chicken industry also made the broiler meatier. Delaware’s “Chicken of Tomorrow” contest sought to create broad-breasted, meaty chickens to maintain the state’s status as the poultry capital of the country. Between 1948 and 1951, contest winners provided the breeding stock for virtually the entire industry, thus redefining chicken as a single type of meaty bird. These improved chickens weighed approximately 50 percent more in 1965 than they did in 1935, and by the mid-1990s broilers were almost double the weight of Depression-era birds (Boyd, 2001, p. 637). The use of subclinical doses of antibiotics in poultry feed facilitated weight gain in birds and helped to suppress disease, which was a chronic problem in the new massive facilities (Boyd, 2001, p. 647). The transformation of chicken from a luxury item to a cheap dietary staple depended not only on more and better birds, but also on an industrial system that integrated feed, vaccines, antibiotics, waste management, automated disassembly lines, rural electrification, refrigerated trucks, and paved roads, among many other features of postwar life.


Dairy farming has a very long history in the United States, dating back to the first settlers in the Massachusetts Bay Colony. In the Jeffersonian world of the yeoman farmer, the dairy cow played a critical role. Cows not only provided manure essential for the mixed farming of mid-Atlantic and northeastern farmers, but also converted cellulose into protein through milk (Stoll, 2002, p. 49). Although milk was highly perishable, it could be relatively easily processed into other products, including butter and cheese, that could be stored for much longer periods of time, sometimes even for many months in cool weather.

Throughout much of the nineteenth and early twentieth centuries, dairying was an unrelenting, labor-intensive way to make a living. The defining feature of dairying is that even in the early twenty-first century lactating cows must be milked in the morning and in the evening. Managing the roughly 120 pounds of urine and manure each cow excreted daily only added to a farmer’s work. Even when cows were grazing in the field, farmers needed to continually move them from one spot to the next so that the manure would be evenly spread and not kill or overnourish the field, rendering it unusable for several years (Stoll, 2002, p. 52). Until the twentieth century, dairying was a local endeavor, with cows living very close to consumers. In New York and other cities, cows were very common until the sanitary movement of the 1880s sought to separate animals from humans to prevent a host of diseases and to address concerns over the quality of “swill milk” (Goodwin, 1999). Cows fed on the spent grains of local brewers produced milk that was low in fat and had a blue hue. Many women and reformers thought this milk was dangerous. Although the milk was low in fat, it was not in fact dangerous, and spent grains continued to be used as a high-protein component of dairy feed in the early twentieth century. It was the growth of large cities, the speed of railroads, and the availability of ice for refrigeration that allowed country milk to move quickly into urban markets (DuPuis, 2002, p. 5). These systems supported the sanitary movement’s efforts to divorce consumers from producers. Yet, given milk’s susceptibility to infection and rancidity, speed, ice, and country air were not always enough. Louis Pasteur’s work on bacteria helped transform milk from “white poison” to a nutritious and wholesome food (DuPuis, 2002, pp. 5 and 77).
With the spread of pasteurization, dairies shifted from small family operations into large factories that converted dangerous raw milk into a safe commodity that could be delivered right into American homes on a daily basis. From the 1910s through the 1940s, milk became an idealized new staple food that nourished generations of Americans (DuPuis, 2002). Yet by the Depression, dairy farmers felt the strain of the overproduction of milk (Hamilton, 2008, pp. 164–165). The economic pressure on dairy farmers only increased with the spread of paper cartons and milk trucks in the 1950s. These technologies not only allowed large commercial dairies that pasteurized and distributed milk to urban areas, suburban districts, and towns to reduce costs, but also kept milk prices low, supporting Americans’ demand for cheap and plentiful milk (Hamilton, 2008, pp. 165–175). Postwar technological systems and suburbanization conspired to force small dairy operations out of the market, leaving only the largest and most productive operations to supply ever-cheaper milk across the country.

The market for milk largely echoed the market for meat, fresh fruits and vegetables, baked goods, and many other food products. Bigger facilities that utilized the most advanced technological systems provided economies of scale and minimized cost to consumers. While consumers saw their grocery bills fall and found it easier to feed growing families more nutritiously than ever before, small and medium-size farmers were squeezed between falling prices and rising input costs (Anderson, 2008). By the time John F. Kennedy brought Camelot to Washington, D.C., in 1960, an entirely new era in food and food processing had begun to radically transform the American diet. These new, highly processed foods, including canned goods, industrial pies and cakes, soda pop, and frozen orange juice, moved beyond staple products like grains, meat, and milk. The age of truly industrial food arrived with the Baby Boom. By the 1980s, boomers/yuppies sought out a wide variety of food options, many of which were hyperprocessed and needed no preparation at all. Microwavable “ethnic” foods and a wide array of snack foods reflected Americans’ embrace of a multicultural palate while at the same time reducing the need to cook. The advent of hyperprocessed food coincided with larger structural changes in the United States, including deindustrialization, dual-income families, rising divorce rates, the farm crisis, latch-key kids, and the obesity “epidemic.” From the second half of the twentieth century through the first decade of the twenty-first century, processed foods made women’s lives easier while at the same time inciting fierce debates over the health of the nation. At the same time, technological systems and globalization allowed more fresh food to flow at lower prices into American markets to satisfy the United States’ hunger for authentic ethnic foods and fresh foods out of season.
By the 1990s, what was and was not a processed food was almost impossible to determine, even for the organic food movement, which sought to reverse the previous 150 years of food industrialization while simultaneously embracing much of the century’s technological advances (Guthman, 2004).

[See also Agricultural Education and Extension; Agricultural Experiment Stations; Agricultural Technology; Canals and Waterways; Food and Diet; Morrill Land Grant Act; Pure Food and Drug Act; Railroads; Refrigeration and Air Conditioning; Rivers as Technological Systems; Rural Electrification Administration; and Technology.]


Anderson, J. L. Industrializing the Corn Belt: Agriculture, Technology, and the Farm Belt. DeKalb: Northern Illinois University Press, 2008.

Boyd, William. “Making Meat: Science, Technology, and American Poultry Production.” Technology and Culture 42, no. 4 (October 2001): 631–664.

Chandler, Alfred D. The Visible Hand: The Managerial Revolution in American Business. Cambridge, Mass.: Belknap Press of Harvard University Press, 1977.

Cronon, William. Nature’s Metropolis: Chicago and the Great West. New York: W. W. Norton and Company, 1991.

Cummings, Richard Osborn. The American and His Food: A History of Food Habits in the United States. Chicago: University of Chicago Press, 1940.

DuPuis, E. Melanie. Nature’s Perfect Food: How Milk Became America’s Drink. New York: New York University Press, 2002.

Goodwin, Lorine Swainston. The Pure Food, Drink, and Drug Crusaders, 1879–1914. Jefferson, N.C.: McFarland & Company, 1999.

Guthman, Julie. Agrarian Dreams: The Paradox of Organic Farming in California. Berkeley: University of California Press, 2004.

Hamilton, Shane. Trucking Country: The Road to America’s Wal-Mart Economy. Princeton, N.J.: Princeton University Press, 2008.

Horowitz, Roger. Putting Meat on the American Table: Taste, Technology, Transformation. Baltimore: Johns Hopkins University Press, 2006.

Larson, Robert A. “Milling.” In The Chemistry and Technology of Cereals as Food and Feed, edited by Samuel A. Matz, pp. 194–197. Westport, Conn.: AVI Publishing Company, 1959.

Levenstein, Harvey. Revolution at the Table: The Transformation of the American Diet. New York: Oxford University Press, 1988.

Panschar, William F. Baking in America: Economic Development. Vol. I. Evanston, Ill.: Northwestern University Press, 1956.

Pursell, Carroll. The Machine in America: A Social History of Technology. Baltimore: Johns Hopkins University Press, 1995.

Stoll, Steven. Larding the Lean Earth: Soil and Society in Nineteenth-Century America. New York: Hill and Wang, 2002.

Turner, Katherine Leonard. “Tools and Spaces: Food and Cooking in Working-Class Neighborhoods, 1880–1930.” In Food Chains: From Farmyard to Shopping Cart, edited by Warren Belasco and Roger Horowitz, pp. 217–232. Philadelphia: University of Pennsylvania Press, 2009.

Vulté, Herman T., and Sadie V. Vanderbilt. Food Industries: An Elementary Text-Book on the Production and Manufacture of Staple Foods. Easton, Pa.: Chemical Publishing Company, 1916.

Yezierska, Anzia. Bread Givers. New York: Persea Books, 1925.

                                      Gabriella M. Petrick