The Great Leaps Forward


Anthony Stigliani


The Evolution of Human Intelligence

The most distinguishing feature of Homo sapiens is the ability to think intelligently, a capacity built on its remarkably sophisticated nervous system. While almost all animals have brains, the human brain is unique in several respects. Compared to that of other species, the human brain is extremely large relative to body mass, and it consumes roughly 25 percent of the glucose metabolized by the body and 15 percent of the oxygen (Clark et al., 1999). All things considered, the human brain is one of the most complex biological systems ever to have evolved on Earth. Relatively little is known, however, about how the human brain came to be, because brains themselves do not fossilize. But skulls do leave a fossil record, and archaeologists have begun to piece together the history of human intelligence by examining changes in our ancestors’ skull morphology. With the relatively recent sequencing of the human genome, complementary research by geneticists is beginning to fill some of the gaps in the fossil record. Together, these lines of research suggest that modern humans are at the tail end of (or perhaps still amid) two overlapping stages of rapid brain development that set human intelligence apart from that of all other species. As shown in Figure 1, these adaptations allowed early hominids to spread around the world and adapt to a plethora of environments very different from the African savannah on which they evolved.

The Early Expansion

Until around two million years ago, nothing was particularly unique about our ancestors’ brains. Then Homo habilis, the oldest member of our genus, began to diverge anatomically from other bipedal apes (e.g., Ponting, 1993). The most notable of Homo habilis’ physical adaptations was the rapid increase in the size of its brain (Lee & Wolpoff, 2003). As shown in Figure 2, the brains of our hominid ancestors grew slowly until the emergence of Homo habilis, which initiated an epoch of exponential brain growth. Perhaps not surprisingly, archaeological evidence of widespread stone tool use appears soon thereafter (Ponting, 1993). During the next million years, the cranial capacity of Homo habilis, at about 600 cm³, grew half again in size to around 900 cm³ in its descendant Homo erectus (Bruner, 2007). For the next 800,000 years or so, the brains of our ancestors continued to grow, reaching almost 1,500 cm³. Only after the appearance of the first anatomically modern humans around 200,000 years ago did the size of the human brain stabilize, perhaps even shrinking in the last 100,000 years (Carr, 2005).
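As a rough back-of-envelope check on these figures, the sketch below computes the average growth in cranial capacity over each interval. The capacities and dates are only the rounded estimates quoted above, not precise measurements:

```python
# Approximate cranial capacities (cm^3) at rough dates (millions of years ago),
# using only the rounded estimates quoted in the text.
milestones = [
    ("Homo habilis", 2.0, 600),
    ("Homo erectus", 1.0, 900),
    ("late pre-modern Homo", 0.2, 1500),
]

# Average growth rate over each interval, in cm^3 per million years.
for (name_a, mya_a, cc_a), (name_b, mya_b, cc_b) in zip(milestones, milestones[1:]):
    rate = (cc_b - cc_a) / (mya_a - mya_b)
    print(f"{name_a} -> {name_b}: {rate:.0f} cm^3 per million years")
```

Even with these coarse numbers, the rate itself rises from roughly 300 to roughly 750 cm³ per million years, which is the accelerating pattern the text describes.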

The Second Leap

[Figure: Hominid Brain Growth]

The explosion in technological innovation ushered in by anatomically modern humans seems puzzling, given that the human brain has not grown in the past 200,000 years. Some have proposed the size of the birth canal as the factor that ultimately limited brain size (Godfrey & Jacobs, 1981). As our ancestors grew larger brains, the size of their bodies changed very little, which undoubtedly led to birth complications as infant heads became increasingly large. In response, females evolved wider hips and likely began to deliver their progeny before full term, when infants’ heads were still small. However, there is a finite limit to hip expansion, just as there are limits on how early an infant can be born and survive. It is therefore possible that the costs of increasing brain size began to outweigh its benefits by the time anatomically modern humans emerged.

This by no means suggests that the human brain stopped evolving. Rather, it likely began to evolve in ways that did not require additional expansion. Recent evidence indicates that one way this was achieved was by adding more anatomical and functional asymmetry to the brain. Although many animals’ brains have a few subtle asymmetries, the modern human brain is full of them (e.g., Zimmer, 2009). For example, specialized cortical structures like Broca’s area and Wernicke’s area, which are necessary for different aspects of language proficiency, are found in the left hemisphere. The right hemisphere, meanwhile, houses the fusiform face area, and damage to this structure has been shown to impair the recognition of human faces (Zimmer, 2009). Interestingly, research suggests that much of this asymmetry is restricted to the newer areas of the human brain (i.e., the frontal, parietal, and temporal lobes). In keeping with this notion, Khaitovich et al. (2008) found evidence that the parts of the human genome controlling the metabolic functions of the brain began to change rapidly starting around 200,000 years ago. These genetic changes served to direct even more energy to early humans’ increasingly asymmetrical brains and are seen by some to represent a distinct stage of cognitive development that eventually produced behaviorally modern humans.

The Role of Diet in Brain Development

Advances in radiometric dating have allowed archaeologists to estimate more accurately when and where the evolution of human intelligence took place. In turn, knowing when and where allows one to address the more complicated question of why we evolved as we did. Since evolution entails complex interactions between a species and its environment, certain aspects of proto-humans’ environments likely triggered the development of bigger, more asymmetrical brains. Indeed, emerging evidence suggests that two fundamental changes to our ancestors’ diets may have been the impetus behind the waves of rapid neurological evolution that gave rise to behaviorally modern humans.

Becoming an Omnivore. The first momentous change was likely the incorporation of meat into the diets of early hominids. Although it is unclear exactly when our ancestors became omnivores, the weight of the evidence suggests that meat became an increasingly important source of food between two and three million years ago, predating the use of stone tools (Shipman & Walker, 1989). In fact, some of the oldest stone tools in the archaeological record, dating back some 2.5 million years, are sharpened rocks probably used to scrape carcasses. This estimate is also corroborated by changes in the surface area of our ancestors’ molars over time. Between four and two million years ago, their molars grew from around 450 mm² to 750 mm², consistent with a diet based almost entirely on fibrous plants. However, as early hominids like Homo habilis began to incorporate meat into their diets by scavenging the remains of large animals and hunting small game, the surface area of their molars shrank, eventually to around 450 mm², since meat requires far less grinding than fibrous plants (Leonard et al., 2003).

While Homo habilis may have introduced meat to the hominid diet, archaeological evidence suggests that full adoption of the hunter-gatherer lifestyle probably began around 1.5 million years ago with the rise and spread of Homo erectus (Bunn & Ezzo, 1993). Accordingly, the surface area of Homo erectus’ molars averaged just around 380 mm², while those of Homo sapiens measure around 320 mm² (Leonard et al., 2003). What is crucial about these findings is that our ancestors’ molars seem to have begun to shrink before their brains began to grow. As shown in Figure 3 (note the slightly different estimated geological ages), Australopithecus robustus had markedly smaller molars than its slightly older relative, Australopithecus boisei, suggesting that Australopithecus robustus may have incorporated more meat into its diet. Nevertheless, Australopithecus robustus’ brain was roughly the same size as Australopithecus boisei’s; brain growth appears to have lagged behind the evolution of smaller teeth.
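Collected in one place, the molar figures quoted in this and the preceding paragraph trace a clear rise-and-fall. The sketch below lines them up and computes the total shrinkage from the peak; the areas are the approximate values cited from Leonard et al. (2003), and the period labels are informal:

```python
# Approximate molar surface areas (mm^2) cited in the text
# (Leonard et al., 2003), ordered from oldest to most recent.
# Labels are informal shorthand, not taxonomic claims.
molar_areas = [
    ("early hominids, ~4 Mya", 450),
    ("fibrous-plant diet peak, ~2 Mya", 750),
    ("Homo habilis", 450),
    ("Homo erectus", 380),
    ("Homo sapiens", 320),
]

areas = [area for _, area in molar_areas]
peak = max(areas)

# Once meat enters the diet, the series shrinks steadily from its peak.
after_peak = areas[areas.index(peak):]
assert after_peak == sorted(after_peak, reverse=True)

shrinkage = (peak - areas[-1]) / peak
print(f"molar area fell {shrinkage:.0%} from its peak")
```

From the 750 mm² peak to the modern 320 mm², that is a reduction of about 57 percent, even as cranial capacity was more than doubling.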

Increased consumption of energy-dense foods like meat likely enabled early hominids to extract more nutrients from their environment than they had previously. In these circumstances, natural selection should have favored those individuals who made use of the extra energy in the most adaptive way. If this was the case, then it seems that the most successful hominids invested much of this surplus energy in the higher metabolic demands of more intelligent brains, rather than in the longer bones or stronger musculature characteristic of other predatory mammals. As their brains acquired a greater aptitude for thinking and reasoning, they could hunt and gather even more successfully, which would have created an even greater surplus of food for thought. In addition, the energy density of meat allowed early hominids to take in less food mass each day without reducing the net amount of nutrients attained. According to Aiello and Wheeler (1995), this change may have led to the increasingly smaller digestive tracts of early hominids, thereby freeing up even more space and energy for neurological development.

The Stone Age Kitchen. Learning to use fire allowed hominids to harness a resource that was, and has remained, inaccessible to all other species. Although researchers continue to debate exactly when our ancestors mastered fire, many would agree that the evidence for controlled fire use is ambiguous until around 500,000 years ago (e.g., Wuethrich, 1998; McCrone, 2000). At first, hominids likely used fire as a source of warmth, protection, and light. In time, however, its use expanded to a wide array of other activities including hunting, tool construction, and, perhaps most importantly, cooking.

Cooking may have had the largest impact on human evolution because cooked meat requires significantly less energy to digest than raw meat, thus freeing up even more dietary energy that could be routed toward brain development (Nixon, 2008). Cooked food does not need to be warmed to body temperature by the digestive system, is easier to chew and digest, and is less likely to carry harmful bacteria. While archaeologists generally agree that cooking was one of our ancestors’ early uses of fire, there is disagreement as to when this began. Barnicott (2005), for one, suggests that humans began to cook their food around 500,000 years ago. As cooking became more pervasive, the more energy-rich diet may have allowed a second wave of sophisticated, energy-demanding brain circuitry to evolve. According to researcher Philipp Khaitovich, prior to cooking, hominids made “the same very boring stone tools for almost two million years” (quoted in Nixon, 2008). Around 200,000 years ago, however, hominids started to create new and better tools. Following these innovations, human ingenuity took off, creating language, agriculture, the steam engine, and, more recently, the computer, all in an eyeblink of evolutionary time.

Interestingly, Khaitovich et al. (2008) found evidence that, beginning around 200,000 years ago, the genes controlling the metabolic functions of Homo sapiens’ brain began to evolve rapidly. While the human brain may have started to slowly decrease in size around 100,000 years ago (Carr, 2005), the findings of Khaitovich et al. (2008) suggest that it continued on a path of rapid evolution that is undetectable in the fossil record. Almost all of the metabolic adaptations they studied served to increase the brain’s energy expenditure, suggesting a parallel evolution of more advanced brain circuitry demanding more and more energy. Although preliminary, these findings add weight to the notion that a new wave of cognitive evolution began in the past 200,000 years and allowed for the development of language, agriculture, and the countless other artifacts of modern man.

The Future of Human Intelligence

Although human intelligence has evolved very quickly over the past 200,000 years, one should not underestimate the importance of the nearly two million years of evolution that gave birth to Homo sapiens and its big brain. The rise of a distinctly ‘human’ intelligence coincided with the incorporation of meat into our ancestors’ diet around two million years ago. Their brains began to grow quickly, but their technologies did not progress beyond simple stone tools until after the emergence of cooking about 500,000 years ago. In turn, the increase in available metabolic energy derived from eating cooked food (particularly cooked meat) may have allowed for a massive rewiring of the brain and the rise of some of our most ingenious cognitive abilities. Both cases show that, coupled with the forces of natural selection, what we eat plays a large role in who we are and what will become of us.

The lessons of evolution can tell us a lot about the importance of cooked food, energy efficiency, and development. Today, we live in a world of great wealth and incredible poverty. While the Western world enjoys a great abundance of energy-dense (cooked) food, an alarming proportion of humanity lives in extreme poverty, threatened by constant hunger and a lack of fuel for cooking and warmth. Moreover, the wasteful, inefficient use of energy by those of us fortunate enough to live in the developed world threatens to impoverish countless future generations. At this point, some may argue that our intelligence is so powerful that we need not worry about these concerns. Perhaps the best way to guard our intelligence, however, is to ensure that all people, now and in the future, will be able to enjoy an abundance of food for thought.


Literature Cited

Aiello, L.C., & Wheeler, P. (1995). The Expensive-Tissue Hypothesis: The Brain and the Digestive System in Human and Primate Evolution. Current Anthropology 36(2), 199-221.

Barnicott, N.A. (2005). Human Nutrition: Evolutionary Perspectives. Integrative Physiological & Behavioral Science 40(2), 114-117.

Bruner, E. (2007). Cranial shape and size variation in human evolution: Structural and functional perspectives. Child’s Nerv Syst 23, 1357–1365.

Bunn, H.T., & Ezzo, J.A. (1993). Hunting and scavenging by Plio-Pleistocene hominids: Nutritional constraints, archaeological patterns, and behavioural implications. J. Archaeol. Sci. 20, 365–398.

Carr, G. (2005, December). The proper study of mankind. The Economist 337, 2-12.

Clark, D.D., Sokoloff L., Siegel, G.J., Agranoff, B.W., Albers, R.W., Fisher, S.K., and Uhler, M.D. (1999). Basic Neurochemistry: Molecular, Cellular and Medical Aspects. Philadelphia: Lippincott, pp. 637–70.

Godfrey, L., & Jacobs, K.H. (1981). Gradual, autocatalytic and punctuational models of hominid brain evolution: A cautionary tale. Journal of Human Evolution 10(3), 255-272.

Khaitovich, P., Lockstone, H.E., Wayland, M.T., Tsang, T.M, Jayatilaka, S.D., Guo, A.J., Zhou, J., Somel, M., Harris, L.W., Holmes, E., Pääbo, S., & Bahn, S. (2008). Metabolic changes in schizophrenia and human brain evolution. Genome Biology 9(8), R124.

Le Journal du Net (2010). Volume cérébral des Hominidés [Hominid brain volume]. Retrieved on January 31, 2010, from http://www.journaldunet.com/science/biologie/dossiers/06/0608-memoire/8.shtml.

Lee, S., & Wolpoff, M.H. (2003). The pattern of evolution in Pleistocene human brain size. Paleobiology 29(2), 186–196.

Leonard, W.R., Robertson, M.L., Snodgrass, J.J., & Kuzawa, C.W. (2003). Metabolic correlates of hominid brain evolution. Comparative Biochemistry and Physiology 136, 5–15.

McCrone, J. (2000, May). Fired up. New Scientist 166(2239), 30-34.

Nixon, R. (2008, August). Cooking and Cognition: How Humans Got So Smart. Retrieved February 1, 2010, from http://www.livescience.com/culture/080811-brain-evolution.html.

Ponting, C. (1993). A Green History of the World. New York: St. Martin’s Press.

Reed, D.L., Smith, V.S., Hammond, S.L., Rogers, A.R., & Clayton, D.H. (2004). Genetic Analysis of Lice Supports Direct Contact between Modern and Archaic Humans. PLoS Biology 2(11), e340.

Shipman, P., & Walker, A. (1989). The costs of becoming a predator. J. Hum. Evol. 18, 373-392.

Wuethrich, B. (1998). Geological Analysis Damps Ancient Chinese Fires. Science 281(5374), 165.

Zimmer, C. (2009, May). The big similarities and quirky differences between our left and right brains. Retrieved February 1, 2010, from http://discovermagazine.com/2009/may/15-big-similarities-and-quirky-differences-between-our-left-and-right-brains/article_view?b_start:int=1&-C=.


last updated 1/25/10