Literature Review

Present-day scholarship on the historical geographies of childhood illness points to clear ties between children's health outcomes and shared social understandings of medical practices and developments relating to childhood illness and disease between 1860 and 1940. Through the emergence of modern medical technologies in epidemiology, along with shifting populations and changes in land value, both the procedures for approaching childhood illness and the shared social notions of what it meant to treat sick children changed drastically during the late nineteenth and early twentieth centuries.

As the first pediatric hospitals opened in the mid-to-late 1800s, childhood health became increasingly relevant and essential in the eyes of the public (Hostetter, 2012). With facilities like hospitals focusing specifically on children’s illnesses and health, milestones such as the development of germ theory and vaccines followed in rapid succession (Preston and Haines, 1991, p. 7). Despite this, attitudes toward these new technologies were marked by insularity and hesitation (Kestenbaum and Feemster, 2015). The public’s distrust of modern medicine was not unwarranted, given that the field was still new and largely unexplored. This distrust led people to rely on lay knowledge and folk remedies and to develop their own approaches to curing ailments, many of which centered on home care and homeopathic treatments (Gillis, 2005). Further, significant changes to the demographic and physical landscapes of the United States affected how children got sick and which children had access to remedies and care when they needed them.

Expanding industrial labor markets drew youth populations and families, particularly immigrants and Black families during the Great Migration, to the Midwestern and Northern United States and into high-density population areas. As these high-density areas grew, more children could access medical care and treatment, and infant mortality rates dropped (Greene, 1984). Social understandings of what a ‘healthy child’ looked like were also changing, as the knowledge base emerging from new pediatric medical practices made it possible to compare baby to baby and narrow down the metrics of the ideal healthy child (Stern, Markel, and Murray, 2002). As so many newly populated areas rose during the Industrial Revolution, these metrics of the healthy child became a cornerstone of parenthood; the child took on a more precious identity that valued health and longevity, in contrast to the expendable working child of the farms and fields (Shulman, 2004).

Shifting attitudes toward childhood illness during the nineteenth and twentieth centuries were indicative of development and modernization, with the healthy, ideal child fitting into the newfound ideal of ‘modern’ American living. Children were considered vulnerable and in need of care: hospitals, vaccinations, and remedies targeted childhood illnesses for the first time. As the population grew and living in dense cities became more commonplace, children became an investment emblematic of the American Dream. To embody this dream, medical care and technology modernized, affording children and families treatment and answers to the illnesses and diseases that once threatened American childhoods.