Disclaimer
• Your life and health are your own responsibility.
• Your decisions to act (or not act) based on information or advice anyone provides you—including me—are your own responsibility.

The Paleo Diet For Australopithecines: Approaching The Meat Of The Matter (Big Brains Require An Explanation, Part IV)

In Part III, we established the following:

  • Bipedalism among human ancestors is associated with a dietary shift away from soft, sugar-rich fruit, and toward hard, fibrous, ground-based foods like nuts, root vegetables, insects, and mushrooms. (And perhaps some meat, though the evidence is inferential.)
  • Both bipedalism and this dietary shift occurred while our ancestors were still forest-dwellers—before we moved into savanna and grassland habitats.
  • Both bipedalism and this dietary shift preceded the massive increase in our ancestors’ brain size.
  • Therefore, neither fruit, nor potatoes, nor walking upright made us human.

Once again, I am giving what I believe to be the current consensus interpretation of the evidence…and where no consensus exists, I offer what I believe to be the most parsimonious interpretation.

(This is a multi-part series. Go back to Part I, Part II, Part III.)

A Quick Recap

4.4 million years ago, Ardipithecus ramidus still had a brain the size of a modern chimpanzee’s, but was a facultative biped partially adapted to a ground-based diet. By 4.1 MYA, Australopithecus anamensis had been selected for more complete dietary adaptation:

Science 2 October 2009: Vol. 326 no. 5949 pp. 69, 94-99
Paleobiological Implications of the Ardipithecus ramidus Dentition
Gen Suwa, Reiko T. Kono, Scott W. Simpson, Berhane Asfaw, C. Owen Lovejoy, Tim D. White

“Ar. ramidus lacks the postcanine megadontia of Australopithecus. Its molars have thinner enamel and are functionally less durable than those of Australopithecus but lack the derived Pan pattern of thin occlusal enamel associated with ripe-fruit frugivory. The Ar. ramidus dental morphology and wear pattern are consistent with a partially terrestrial, omnivorous/frugivorous niche.”

And the Laetoli footprints show that hominins were fully bipedal by 3.7 MYA, though we have no evidence for brain size until…

Australopithecus afarensis: Upright Gait, Smaller Body, Bigger Brain

Australopithecus afarensis lived from approximately 3.9 to 2.9 MYA. (Once again, these are human-drawn distinctions imposed on a continuum of hominin fossils.) It was slightly shorter than Ardipithecus (3’6″) and weighed much less: 65# versus 110#. The famous “Lucy” fossil is about 40% of an A. afarensis skeleton from 3.2 MYA.

One interpretation of Lucy

Lucy might have looked like this.

Additionally, its back had a double curve similar to a modern human’s; its arms were shorter than Ardipithecus’; its knees supported an upright gait; and its feet had arches like ours—meaning that it was fully bipedal, and that A. afarensis is very likely the hominin which made the Laetoli footprints.

This is a recent finding: only last year did its discoverers announce that they had found a foot bone from A. afarensis which appears to settle this long-simmering question.

Science 11 February 2011: Vol. 331 no. 6018 pp. 750-753
Complete Fourth Metatarsal and Arches in the Foot of Australopithecus afarensis
Carol V. Ward, William H. Kimbel, and Donald C. Johanson

“A complete fourth metatarsal of A. afarensis was recently discovered at Hadar, Ethiopia. It exhibits torsion of the head relative to the base, a direct correlate of a transverse arch in humans. The orientation of the proximal and distal ends of the bone reflects a longitudinal arch. Further, the deep, flat base and tarsal facets imply that its midfoot had no ape-like midtarsal break. These features show that the A. afarensis foot was functionally like that of modern humans and support the hypothesis that this species was a committed terrestrial biped.”

Most importantly, A. afarensis’ brain was much larger than Ardipithecus’: 380-430cc versus 300-350cc. This means that selection pressure was favoring bigger brains nearly 4 million years ago, while allowing our ancestors’ bodies to shrink dramatically.
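
For readers who want to check the arithmetic, here is a minimal sketch (in Python) using the midpoints of the figures quoted above; the exact percentages depend on which ends of the ranges you compare, and no new data is involved.

```python
# Percent changes from Ardipithecus to A. afarensis, computed from the
# midpoints of the ranges quoted in this post (not from new data).
ardi_brain = (300 + 350) / 2     # cc
afar_brain = (380 + 430) / 2     # cc
ardi_mass, afar_mass = 110, 65   # pounds

brain_change = (afar_brain - ardi_brain) / ardi_brain * 100   # about +25%
mass_change  = (afar_mass - ardi_mass) / ardi_mass * 100      # about -41%
print(f"brain: {brain_change:+.0f}%, body mass: {mass_change:+.0f}%")
```

With the midpoints, that works out to roughly a 25% larger brain and a 41% smaller body.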

Now we’re getting to the meat of the problem. What could have caused this selection pressure?

“Is It Just Me, Lucy, Or Is It Getting Colder?”

During the Pliocene (5.3-2.6 MYA), the Earth’s climate—though far warmer than today’s—became cooler, drier, and more seasonal (see the temperature graphs and detailed explanation in Part I), a multi-million-year trend which began with the Middle Miocene Disruption around 14.5 MYA. Consequently, African forests were shrinking, and savannas and grasslands were growing in their place.

With less forest available to live in, some number of our ancestors faced a stark choice: adapt to living outside the forest, or die out. Those that stayed in the trees became what we know today as chimpanzees and bonobos. Those that eventually left became our ancestors—the hominins.

PNAS August 17, 2004 vol. 101 no. 33 12125-12129
High-resolution vegetation and climate change associated with Pliocene Australopithecus afarensis
R. Bonnefille, R. Potts, F. Chalié, D. Jolly, and O. Peyron

Through high-resolution pollen data from Hadar, Ethiopia, we show that the hominin Australopithecus afarensis accommodated to substantial environmental variability between 3.4 and 2.9 million years ago. A large biome shift, up to 5°C cooling, and a 200- to 300-mm/yr rainfall increase occurred just before 3.3 million years ago, which is consistent with a global marine δ18O isotopic shift.

Our results show that a diversity of biomes was available to A. afarensis. Recovery of hominin fossils through the entire stratigraphic range suggests no marked preference by A. afarensis for any single biome, including forest. Significant cooling and biome change had no obvious effect on the presence of this species through the sequence, a pattern of persistence shared by other Pliocene mammal taxa at Hadar and elsewhere (6, 27, 32). We hypothesize that A. afarensis was able to accommodate to periods of directional cooling, climate stability, and high variability.

As we found in Part I, and as we’ve seen from the chimp-sized brain of Ardipithecus, shrinking habitat does not explain increased brain size by itself—but it does provide an incentive to find ways to live in marginal habitat, or entirely different biomes. And it’s clear that bipedalism would be an advantage in forest margins and open forests, where direct travel from tree to tree wasn’t possible. In addition, more light reaching the ground would mean more food available on the ground, versus up in the tree canopy—so bipedal ground-dwelling would have been a good survival strategy in forest habitat that was marginal for a tree-dweller.

My interpretation of the evidence is that bipedalism did not cause brain expansion, but it was a necessary precondition. It allowed our ancestors to expand beyond the forest margin—and it freed up our ancestors’ hands for other tasks, such as…

How Bipedalism Enables Tool Use, Re-Use, and Manufacture

Facultative bipeds, which cannot walk on two legs for very long, can’t carry tools around with them: they must make a tool out of whatever materials exist near the point of use, and discard it soon after. Therefore, the tools they make must remain relatively simple, since they can’t afford to spend much time on single-use items—and their choice of raw materials is greatly constrained. (Yes, I’m ignoring any hypothesis that gives Ardipithecus ramidus the ability to construct backpacks.)

In contrast, full bipeds can carry around their tools in anticipation of needing them, and can keep them for future use. Therefore, they can spend the time and effort to make complex, reusable tools—and they can use any raw materials they have access to, not just those near the point of use.

We know that modern chimpanzees make spears, termite sticks, and other wooden tools—but is there evidence for tool use previous to the Oldowan industry, 2.6 MYA?

Recall that the Oldowan industry marks the beginning of the Paleolithic age, and happens to coincide with the beginning of the Pleistocene epoch. (If these terms are confusing you, I explain them in Part II.)

Rocks, Meat, and Marrow in the Pliocene

Nature 466, 857–860 (12 August 2010) — doi:10.1038/nature09248
Evidence for stone-tool-assisted consumption of animal tissues before 3.39 million years ago at Dikika, Ethiopia
Shannon P. McPherron, Zeresenay Alemseged, Curtis W. Marean, Jonathan G. Wynn, Denné Reed, Denis Geraads, René Bobe, Hamdallah A. Béarat

“On the basis of low-power microscopic and environmental scanning electron microscope observations, these bones show unambiguous stone-tool cut marks for flesh removal and percussion marks for marrow access. … Established 40Ar–39Ar dates on the tuffs that bracket this member constrain the finds to between 3.42 and 3.24 Myr ago, and stratigraphic scaling between these units and other geological evidence indicate that they are older than 3.39 Myr ago.”

It’s fair to say that no one knows what to do with this particular piece of evidence, so it tends to simply get ignored or dismissed. What we know is that the researchers found several ungulate and bovid bones, dated to 3.4 MYA, which were scraped and struck by rocks. The scrapes are not natural, nor are they from the teeth of predators, and they appear to date from the same time as the bones.

A bone at Dikika

One of the bones at Dikika. The reality of paleontology is far less exciting than the hypotheses it generates.

Unfortunately, no stone tools or fossil hominins were found there, so we can’t say for sure who made the marks. But the simplest interpretation is that a hominin used a rock to scrape meat off of the bones of large prey animals, and to break them open for marrow.

It is likely that the reason this evidence isn’t more widely accepted is that the researchers make one huge assumption: that the scrape marks were made by deliberately fashioned stone tools, 800,000 years before the first evidence we have of stone tool manufacture—even though no such tools were found.

I believe the most parsimonious interpretation is that the scrape marks were indeed made by Australopithecus afarensis using one of the naturally-occurring volcanic rocks found in abundance in the area. Given the slow pace of technological change (millions of years passed between major changes in stone tool manufacture, and that’s for later hominins with much larger brains than A. afarensis), it would be extremely surprising if naturally-occurring sharp rocks hadn’t been used for millions of years before any hominin thought to deliberately make them sharper—

It’s Not Just The Discovery…It’s The Teaching And The Learning

—and, more importantly, before their children were able to learn the trick, understand why it was important, and pass it on to their own children.

Those of you who were able to watch the documentary “Ape Genius”, to which I linked in Part I, understand that intelligence isn’t enough to create culture. In order for culture to develop, the next generation must learn behavior from their parents and conspecifics, not by discovering it themselves—and they must pass it on to their own children. Chimpanzees can learn quite a few impressive skills…but they have little propensity to teach others, and young chimps apparently don’t understand the fundamental concept that “when I point my finger, I want you to pay attention to what I’m pointing at, not to me.”

So: the developmental plasticity to learn is at least as important as the intelligence to discover. Otherwise, each generation has to make all the same discoveries all over again. It is theorized that this plasticity is related to our less-aggressive nature compared to chimpanzees…but that’s a whole other topic, for another time.

In conclusion, the Dikika evidence pushes meat-eating and stone tool-using (though not stone tool-making) back to at least 3.4 MYA, well into the Pliocene. And though we’re not sure whether that meat was obtained by hunting, scavenging, or both, we can add it to the other foods that we’re reasonably sure formed its diet to produce the following menu:

The Paleo Diet For Australopithecus afarensis

Eat all you can find of:

  • Nuts
  • Root vegetables
  • Insects
  • Mushrooms
  • Meat (particularly bone marrow)

Eat sparingly:

  • Fruit (your tooth enamel won’t withstand the acids)
  • Foliage (your teeth aren’t shaped correctly for leaf-chewing)

In other words, A. afarensis was most likely eating a diet within the existing range of modern ancestral diets—3.4 million years ago.

The only major addition to this diet previous to the appearance of anatomically modern humans is the gathering of shellfish, known from middens dated to 140 KYA at Blombos Cave.

Our Takeaway (so far)

  • Our ancestors’ dietary shift towards ground-based foods, and away from fruit, did not cause an increase in our ancestors’ brain size.
  • Bipedalism was necessary to allow an increase in our ancestors’ brain size, but did not cause the increase by itself.
  • Bipedalism allowed A. afarensis to spread beyond the forest, and freed its hands to carry tools. This coincided with a 20% increase in brain size from Ardipithecus, and a roughly 40% drop in body mass.
  • Therefore, the challenges of obtaining food in evolutionarily novel environments (outside the forest) most likely selected for intelligence, quickness, and tool use, and de-emphasized strength.
  • By 3.4 MYA, A. afarensis was most likely eating a paleo diet recognizable, edible, and nutritious to modern humans.
  • The only new item was large animal meat (including bone marrow), which is more calorie- and nutrient-dense than any other food on the list—especially in the nutrients (e.g. animal fats, cholesterol) which make up the brain.
  • Therefore, the most parsimonious interpretation of the evidence is that the abilities to live outside the forest, and thereby to somehow procure meat from large animals, provided the selection pressure for larger brains during the middle and late Pliocene.

Live in freedom, live in beauty.

JS


We’re not done yet…in fact, we’re not even to the Paleolithic! Continue to Part V, “Why Are There Southern Apes In Ethiopia?”

Are you enjoying this series, or is it too abstruse for you? Please leave a comment and let me know!

Big Brains Require An Explanation, Part III: Optimal Foraging Theory, And Our Story Begins On Two Legs

(This is Part III of a multi-part series. Go back to Part I, Part II.)

Our Story So Far

  • It is not enough to state that the availability of high-quality food allowed our ancestors’ brains to increase in volume from ~400cc to ~1500cc between 2.6-3 MYA and 100-200 KYA. We must explain the selection pressures that caused our brains to more than triple in size—instead of simply allowing us to increase our population, or to become faster or stronger.
  • To gloss over this explanation is a teleological error. It assumes that evolution has a purpose, which is to create modern humans.
  • Climate change is most likely a factor—but it is insufficient, by itself, to create this selection pressure.
  • The Paleolithic is an age defined by the use of stone tools (“industries”) to assist in hunting and gathering. It began approximately 2.6 MYA, with the first known stone tools, and ended between 20 KYA and 5 KYA, depending on when the local culture adopted a Mesolithic or Neolithic industry.
  • The Pleistocene began exactly 2.588 MYA and ended 11,700 BP, and is defined by the age of specific rock (or ice) formations.
  • Therefore, if we wish to locate an event precisely in time, we need to speak in terms of geological time—the Pliocene and Pleistocene epochs. If we wish to identify an event relative to human technological capability, we need to speak of cultural time—the Paleolithic age.
  • Sexual selection is a fascinating subject, but I see no need to invoke it to explain the increase in hominid brain size from the start of the Paleolithic to the rise of anatomically modern humans.

A Timeline Of Facts, A Narrative To Join Them

The factual knowledge we have about human behavior (including diet) during the Pleistocene is limited by the physical evidence we’ve discovered so far—which becomes thinner the farther back in time we go. Therefore, any narrative we construct from these facts must necessarily remain contingent on future discoveries.

However, the evidence we have strongly supports the currently accepted hypothesis for the evolution of human intelligence. I’ll do my best to compress several semesters of anthropology and evolutionary theory into a timeline that tells our ancestors’ story.

First, a key concept: in order to explain a more than tripling of brain size over nearly 3 million years, a single event is not sufficient. It’s not enough to say “Hunting is hard, so we had to get smarter.” We must postulate a sequence of events—one which creates the most parsimonious narrative from the physical evidence.

“Parsimonious” means “stingy” or “frugal”. It is frequently used by scientists as part of the phrase “the most parsimonious hypothesis/theory/explanation”, which means “the explanation which requires the least speculation and depends on the fewest unknowns.” (Also see: Occam’s razor.)

Before we start our narrative, we must define one more term: optimal foraging theory.

Optimal Foraging Theory

Optimal foraging theory (OFT) is a simple concept: “…Decisions are made such that the net rate of energy capture is maximized.” (Sheehan 2004)

This is because efficiency—obtaining more food for less effort—is rewarded by natural selection. Efficient foragers survive better during difficult times, and they spend less time exposed to the risks of foraging. This leaves them more likely to survive, and with more time to seek mates, raise offspring, or simply rest.

In the simplest case, herbivores select the most nutritious plants, and predators select the fattest, slowest herbivores. However, many complicated behaviors result from application of this simple rule. Two examples: for herbivores, leaving the herd costs energy and makes being eaten by a carnivore more likely; for predators, unsuccessful hunts cost energy and make starvation more likely.

Due to time and space constraints, we’re barely scratching the surface of OFT. This article provides a brief introduction, and Wikipedia goes into more detail—including many refinements to the basic model. For an in-depth exploration, including several interesting and complex behaviors resulting entirely from its real-world application, read this textbook chapter (PDF).

The result of OFT is, as one might hope, common sense: our ancestors would have eaten the richest, most accessible foods first.
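
To make the “net rate of energy capture” idea concrete, here is a minimal sketch, in Python, of the classic prey-choice (diet breadth) model: rank food types by profitability (energy gained per unit of handling time), then add them to the diet only while the next type’s profitability exceeds the overall capture rate the diet already achieves. All of the numbers below are invented purely for illustration; they are not data from any study cited here.

```python
def optimal_diet(foods):
    """foods: list of (name, e, h, lam) tuples.
    e   = energy per item (kcal)
    h   = handling time per item (hours)
    lam = encounter rate while searching (items per hour)
    Returns (list of food types worth pursuing, net energy capture rate)."""
    ranked = sorted(foods, key=lambda f: f[1] / f[2], reverse=True)  # by profitability e/h
    chosen, sum_e, sum_h = [], 0.0, 0.0
    for name, e, h, lam in ranked:
        rate_so_far = sum_e / (1.0 + sum_h)   # kcal per hour of foraging (disc equation)
        if e / h > rate_so_far:               # pursuing this item beats ignoring it
            chosen.append(name)
            sum_e += lam * e
            sum_h += lam * h
        else:
            break                             # everything less profitable is ignored
    return chosen, sum_e / (1.0 + sum_h)

# Hypothetical numbers, for illustration only:
foods = [
    ("bone marrow", 800, 0.5, 0.05),   # very profitable, rarely encountered
    ("nuts",        200, 0.2, 0.50),
    ("tubers",      300, 0.6, 0.30),
    ("foliage",      40, 0.4, 2.00),   # abundant, but barely worth the chewing time
]
print(optimal_diet(foods))   # -> (['bone marrow', 'nuts', 'tubers'], ~176 kcal/hr)
```

With these made-up numbers, marrow, nuts, and tubers make the cut while foliage is ignored despite being the most abundant item: exactly the kind of common-sense result described above.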

Our Story Begins On Two Legs: Ardipithecus ramidus

Our story begins in an African forest during the Pliocene epoch, 4.4 million years ago. (Our ancestors have already parted ways with the ancestors of chimpanzees and bonobos. This occurred perhaps 6.5 MYA, in the late Miocene.)

The Miocene epoch lasted from 23 MYA to 5.3 MYA. The Pliocene epoch lasted from 5.33 to 2.59 MYA, and the Pleistocene lasted from 2.59 MYA to 11,700 BP.

It’s important to note that many different hominins existed throughout the Pliocene and Pleistocene. We aren’t absolutely certain which were directly ancestral to modern humans, and which represented stem species that subsequently died out…but the fossil record is complete enough that we’re unlikely to dig up anything which radically changes this narrative.

Though there are fascinating fossil finds which date even earlier (e.g. Orrorin), we’ll begin with Ardipithecus ramidus, a resident of what is now Ethiopia in the early Pliocene, 4.4 MYA. Today it’s the Afar desert—but in the Pliocene, its habitat was a lush woodland which occasionally flooded.

What Ardipithecus ramidus might have looked like. Click the picture for a BBC article.

“Ardi” was about four feet tall, with a brain the size of a modern chimpanzee’s (300-350cc). She was most likely what we call a facultative biped, meaning that she walked on four legs while in trees, and on two legs while on the ground: though her pelvis was adapted to walking upright, her big toe was still opposable and she had no arches, leaving her feet better adapted to gripping trees than to walking or running.

You can learn much more about Ardi at Discovery.com’s extensive and informative (though Flash-heavy and somewhat hyperbolic) website. For those with less patience or slow Internet connections, this NatGeo article contains a discussion of Ardi’s importance and possible means of locomotion. (Warning: both contain some highly speculative evolutionary psychology.)

From the evidence, we know that there must have been selection pressure to sacrifice tree-climbing ability in exchange for improved bipedal locomotion—most likely due to an increased ability to take advantage of ground-based foods. Though evidence is thin, its discoverers think (based on its teeth) that Ardi consumed a similar diet to its successor Australopithecus anamensis—nuts, root vegetables, insects, mushrooms, and some meat. (This supports the idea that Ardi ate more ground-based food, such as root vegetables and mushrooms, and less tree-based food, such as fruit.) And stable isotope analysis of its tooth enamel confirms that Ardipithecus was a forest species, lacking significant dietary input from grasses or animals that ate grasses.

Fruit Is For The Birds (And The Bats, And The Chimps): Australopithecus anamensis

Our next data point comes just a few hundred thousand years later.

“Early Humans Skipped Fruit, Went for Nuts”
Discovery News, November 9, 2009

Macho and colleague Daisuke Shimizu analyzed the teeth of Australopithecus anamensis, a hominid that lived in Africa 4.2 to 3.9 million years ago.

Based on actual tooth finds, Shimizu produced sophisticated computer models showing multiple external and internal details of the teeth. One determination was immediately clear: Unlike chimpanzees, which are fruit specialists, the hominid couldn’t have been much of a fruit-lover.

“Soft fleshy fruits tend to be acidic and do not require high bite forces to be broken down,” explained Macho. “The enamel microstructure of A. anamensis indicates that their teeth were not well equipped to cope with acid erosion, but were well adapted to masticate an abrasive and hard diet.”

The researchers therefore believe this early human ate nuts, root vegetables, insects—such as termites—and some meat. While they think certain flowering plants known as sedges might have been in the diet, Lucy and her relatives were not properly equipped for frequent leaf-chewing.

(Hat tip to Asclepius for the reference.)

Here’s the original paper:

Journal of Human Evolution Volume 57, Issue 3, September 2009, Pages 241–247
Dietary adaptations of South African australopiths: inference from enamel prism attitude
Gabriele A. Macho, Daisuke Shimizu

Unfortunately, since all we have yet found of Australopithecus anamensis is pieces of a jawbone and teeth, a fragment of a humerus, and a partial tibia (and those not even from the same individual!), we don’t know its cranial capacity. We do know that its range overlapped that of Ardipithecus—but since remains have also been found in transitional environments, it may not have been a pure forest-dweller.

Either way, it appears that our ancestors had been selected away from a fruit-based diet, and towards an omnivorous diet more compatible with savanna-dwelling, even before they left the forest.

Our Story Continues…With Footprints

This brings us to an unusual fossil find…the Laetoli footprints, left in volcanic ash 3.7 MYA, cemented by rainfall, and preserved by subsequent ashfalls. Their form and spacing show that the hominins who made them were fully bipedal: their feet had arches and an adducted big toe, and they walked at or near human walking speed.

A footprint at Laetoli. Excavating the Laetoli footprints.

“Adducted” means “closer to the midline”: their big toes were close to their other toes, like a modern human’s—quite unlike the widely spaced, opposable big toe of Ardipithecus.

And though we’re not completely sure, it is generally accepted that the footprints were made by Australopithecus afarensis, the next player in our story. Here’s the original paper by Leakey and Hay, for those interested:

Nature Vol. 278, 22 March 1979, pp. 317-323
Pliocene footprints in the Laetolil Beds at Laetoli, northern Tanzania
Leakey, M. D. and Hay, R. L.

In summary, it’s clear from what we know of Ardipithecus, and Australopithecus anamensis, that bipedalism long preceded our ancestors’ move into savanna and grassland habitats. This makes sense: a clumsily-waddling knuckle-walker would stand no chance outside the safety of the forest, whereas a bipedal ape can survive in the forest so long as it retains some ability to climb trees—a talent even humans haven’t completely lost.

Furthermore, our dietary shift towards ground-based foods, and away from fruit, also preceded our ancestors’ move into savanna and grassland habitats.

Finally, and most importantly, both of these changes preceded the massive increase in our ancestors’ brain size.

Live in freedom, live in beauty.

JS

Continue to Part IV, “The Paleo Diet For Australopithecines”!

(Or, go back to Part I or Part II.)


(Yes, this series has expanded far beyond my original expectations! Frankly, the subject requires it…and you’re saving thousands of dollars in post-secondary tuition, so buck up and buy a T-shirt or something.)

Big Brains Require An Explanation, Part II: Sexual Selection, and What Does “Paleolithic” Mean, Anyway?

When I wrote Part I of this article, it expanded to two parts…and now it’s expanded to three! So if an issue you were hoping to learn about hasn’t yet been covered, rest assured I’ll get to it.

(Or, go back to Part I.)

Let’s Get Oriented In Time: What Does “Paleolithic” Mean?

Since we’ve been talking about the “paleo diet” for years, and this series explores the increased brain size and behavioral complexity that took place during the Paleolithic, I think it’s important to understand exactly what the term “Paleolithic” means. Yes, everyone knows that it happened a long time ago—but how long? And how is the Paleolithic different from the Pleistocene? What do all these terms mean, anyway?

First, Some Common Archaeology Terms And Abbreviations

BP = years Before Present. “The artifact was dated to 6200 BP.”
KYA (or ka) = thousands of years Before Present. “The bones were dated to 70 KYA.”
MYA (or ma) = millions of years Before Present. “The Permo-Triassic extinction occurred 250 MYA.”
industry = a technique that produced distinct and consistent tools throughout a span of archaeological time. Examples: the Acheulean industry, the Mousterian industry.

Oldowan choppers

They don't look like much—but they were much better than fingernails or teeth at scraping meat off of bones.


The word itself is a straightforward derivation from Greek. “Paleo-” means “ancient”, and “-lithic” means “of or relating to stone”, so “Paleolithic” is just a sophisticated way to say “old rocks”. Its beginning is defined by the first stone tools known to be made by hominids, dated to approximately 2.6 MYA—the Oldowan industry—and it ends between 20,000 and 5,000 BP, with technology generally agreed to be transitional towards agriculture (the “Mesolithic” industries).

The Paleolithic age is further divided:

  • Lower Paleolithic: 2.6 MYA – 300 KYA. Defined by the Oldowan and Acheulean industries.
  • Middle Paleolithic: 300 KYA – 30 KYA. Defined primarily by the Mousterian and Aterian industries.
  • Upper Paleolithic: 50 KYA – between 20 and 5 KYA. Defined by a host of complex industries.
  • (Click here for more information, including links to all the above terms.)

The reason for the imprecise ending of the Upper Paleolithic (and the overlap between Paleolithic stages) is not because there is doubt about the dates of such recent artifacts…it is because the Paleolithic is a technological boundary, not a temporal boundary, and is defined by the suite of tools in use. So for the first cultures to transition towards agriculture, the Paleolithic ended approximately 20 KYA (and was succeeded by the Mesolithic), whereas other cultures used Paleolithic technology until perhaps 5000 BP.

It’s also important to keep in mind that there are continuing definitional squabbles, particularly with the Mesolithic and Neolithic. What constitutes a Mesolithic culture vs. an Epipaleolithic culture? If a culture never takes up farming, is it still Neolithic if it uses similar tools and technology?

I don’t like to spend too much time in this morass, because it’s not an interesting argument—it’s just a failure to agree on definitions. However, it is always true that Paleolithic cultures were hunter-gatherers. Furthermore, it is almost always true that Neolithic cultures were farmers. (There are a few cases where nomadic cultures adopted Neolithic technology, such as pottery.)

So when we are speaking of a “Paleolithic diet”, we are speaking of a diet nutritionally analogous to the diet we ate during the Paleolithic age—the age during which selection pressure caused our ancestors to evolve from 3’6″, 65# australopithecines with 400cc brains into tall, gracile, big-brained, anatomically modern humans with 1400cc brains. (A figure which has decreased by roughly 10% during the last 5000 years.)

No, we can’t just ‘eat like a caveman’: the animals are mostly extinct and the plants have been bred into different forms. I discuss the issue at length in this article: The Paleo Identity Crisis: What Is The Paleo Diet, Anyway?

Now Let’s Orient Ourselves In Geological Time

In contrast to archaeological ages, the Pleistocene is a geological term (an “epoch”), defined precisely in time as beginning 2.588 MYA and ending 11,700 BP. It’s preceded by the Pliocene epoch (5.332 to 2.588 MYA) and followed by the Holocene epoch (11,700 BP – present).

You’ll see a lot of sources that claim the Pleistocene began 1.6 or 1.8 MYA. This is because the definition was changed in 2009 to its present date of 2.588 MYA, so as to include all of the glaciations to which I referred in Part I.

(More specifically, geological time divisions are defined by a “type section”, which is a specific place in a specific rock formation, and which is dated as precisely as possible given available technology.)

Remember, these are all just names…changing the name doesn’t alter the events of the past.

To give some idea of the time scales involved, our last common ancestor with chimps and bonobos lived perhaps 6.5 MYA, the dinosaurs died out 65.5 MYA, and Pangaea broke up 200 MYA.

Note that the middle timeline of the illustration below zooms in on the end of the top timeline, and the bottom timeline zooms in on the end of the middle timeline. Also note that the time period we’re exploring takes up one tiny box in the lower right, so small that the word “Pleistocene” doesn’t even fit inside it!

Geological timeline of the Earth, from The Economist

Click the image for a larger and more legible version, and an interesting article from The Economist.

For a slightly deeper look into the significance of each geological period, I highly recommend you click here for a graphical, interactive timeline. And here’s a long explanation of the terminology: ages, epochs, eons, and so on.

Summary: Paleolithic or Pleistocene?

The Paleolithic began approximately 2.6 MYA, with the first known stone tools, and ended between 20 KYA and 5 KYA, depending on when the local culture adopted a Mesolithic or Neolithic industry. Since it’s defined by our knowledge of hominid tool use, these dates could change in the future.

The Pleistocene began exactly 2.588 MYA and ended 11,700 BP. These dates are defined by our best estimates of the age of two specific pieces of rock (or ice) somewhere on the Earth.

So though the two terms are measuring nearly identical spans of time, they’re defined by two completely different phenomena…and since we’re speaking of human development, it is appropriate to use the term defined by human artifacts—the Paleolithic age.

Did Sexual Selection Drive The Australopithecus -> Homo Transition?

Evolutionary psychology is great fun to read about…but the problem with extrapolating it back into the Lower and Middle Paleolithic is that it’s pure speculation. The entire fossil record of this era of hominids can be itemized on one Wikipedia page, and I think it’s extremely risky to draw behavioral conclusions so far beyond the physical evidence.

More importantly, though, it’s unnecessary to invoke sexual selection in order to explain the growth in human brain size.

“Even if the survivalist theory could take us from the world of natural history to our capacities for invention, commerce, and knowledge, it cannot account for the more ornamental and enjoyable aspects of human culture: art, music, sports, drama, comedy, and political ideals.”
-Geoffrey Miller, “The Mating Mind”

While this may very well be true, the first known archaeological evidence of art (blocks of ocher engraved with abstract designs) is dated to just 75,000 years ago, at Blombos Cave in South Africa—long after our ancestors first became anatomically modern c. 200,000 years ago. (Venus figurines are much more recent: the earliest is dated to 35 KYA.)

The first known art: carved red ocher

Click the image for more information about Blombos Cave.


The term “anatomically modern humans” refers to ancestral humans whose remains fall within the range of variations exhibited by humans today. We refer to such humans as the subspecies Homo sapiens sapiens.

Note that as with all fossil classifications, “anatomically modern” is a judgment call. There was no instant transition: a beetle-browed, heavy-limbed, archaic Homo sapiens did not suddenly give birth to Salma Hayek, and there are indeed many transitional fossils with a mix of archaic and modern features, usually known as “Early Modern Humans”.

Furthermore, the behavior of the few remaining African hunter-gatherer tribes, such as the Hadza and the Ju/wasi, supports the interpretation that sexual selection simply reinforced the same selection pressures as natural selection:

Human Nature 15:364-375.
Mate Preferences Among Hadza Hunter-Gatherers
Frank W. Marlowe

“Women placed more value on men being good foragers (85% of those women said “good hunter”) than on any other trait.”

National Geographic, December 2009
“The Hadza”
Michael Finkel

“Onwas joked to me that a Hadza man cannot marry until he has killed five baboons. […] Ngaola is quiet and introspective and a really poor hunter. He’s about 30 years old and still unmarried; bedeviled, perhaps, by the five-baboon rule.”

The Old Way: A Story Of The First People
Elizabeth Marshall Thomas

“A young man may not marry until he has killed a big game animal (preferably a large antelope, although a duiker or a steenbok will also suffice) and proved himself a hunter.”
     …
“His [/Gunda’s] victim had been only a duiker, but a duiker is plenty big enough to qualify a boy for marriage.”
     …
“He [≠Toma] had few living relatives and no close ones, and thus could offer her no in-laws who could help her if the need arose, but he was an excellent hunter. This would appeal to any girl. So !U nagged her parents until they consented to the marriage.”

In conclusion: the evidence is that sexual selection, if it was an important force, was providing the same selection pressure as natural selection—and that the behaviors most often attributed to sexual selection postdate our evolutionary transformation into anatomically modern humans. Furthermore, it seems prudent not to invoke a factor for which our evidence is entirely speculative when there are other factors sufficient to explain our ancestors’ transformation.

Therefore, while sexual selection is a fascinating subject worthy of discussion, I don’t see a need to invoke it as a separate force to explain the increase in hominid brain size and behavioral complexity from the beginning of the Paleolithic (2.6 MYA) to the time of anatomically modern humans (200-100 KYA).

Live in freedom, live in beauty.

JS

Continue to Part III, in which we explain Optimal Foraging Theory and begin the story of our ancestors.

(Or, go back to Part I.)


To my US readers: may I request that you make your Amazon.com purchases through this link (or the link in the sidebar)? It costs you nothing, and the small extra spiff helps me keep gnolls.org updated and ad-free. Thank you!

And yes, you can buy anything you want after you click on the link—not just The Gnoll Credo.