Remember back when “Are white potatoes paleo?” was the biggest question facing the paleo community? Now we’re seeing perfectly respectable paleo bloggers advocating butter and heavy cream…and some are even experimenting with white rice.
What sort of caveman diet is that?
And just what is the “paleo diet”, anyway? Is the term becoming diluted because we just can’t stop eating delicious cheat foods—or is it still a valid concept?
First, we need to define the paleo diet. Here’s one attempt, which I’ve chosen because it’s typical:
“With readily available modern foods, The Paleo Diet mimics the types of foods every single person on the planet ate prior to the Agricultural Revolution (a mere 500 generations ago).” -Dr. Loren Cordain, “The Paleo Diet”
This is a simple and concise definition, and it avoids the common pitfall of “eat only what cavemen ate”: we hunted mammoths, mastodons, gomphotheres, glyptodonts, wisents, and most other megafauna to extinction, so “what cavemen ate” is no longer an option. And the fruits and vegetables found in a supermarket have little to do with ancestral, wild varieties: they’re products of agricultural domestication. Paleolithic humans didn’t wander around the African savanna picking Paleolithic broccoli.
Unfortunately, this definition is also misleading. “Every single person on the planet” didn’t eat the same diet during the more recent times (Upper Paleolithic, 10,000-40,000 years ago) for which we have good evidence—and during earlier times (Lower/Middle Paleolithic, 2.6 million-140,000 years ago), the direct evidence is too sparse to make specific recommendations.
Clearly we must do better.
Surviving Lean Times Is What Defines Us
The ability to survive lean times is what defines a long-lived, slowly-reproducing species like humans. It doesn’t matter how successful we are during good times if bad times kill everyone off.
Even if we thrived during “normal” years, but 90% of us died off during each 100-year drought, we would have gone extinct long ago—because hunter-foragers don’t reproduce quickly enough to grow their population ten-fold in 100 years. Even a terrible disaster that happens once every thousand years is basically a continual condition on the scale of millions of years of hominid evolution.
Nomadic hunter-foragers are hard-pressed to have more than one child every four years or so. Lacking blenders and convenient jars of pablum, hunter-forager children are typically breastfed for years; it’s very difficult to care for a second child until the first is big enough to keep up with the tribe on foot; and it takes a long time for the mother to build her nutritional reserves back up.
In contrast, one mosquito can lay thousands of eggs in just a few weeks, and bacterial generations are often measured in minutes.
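The arithmetic behind this argument is easy to check with a toy simulation. This is purely illustrative: the growth rate, drought interval, and survival fraction below are invented numbers chosen to match the article's hypothetical, not archaeological data.

```python
# Illustrative sketch: a population that grows slowly in good times
# but loses 90% of its members in a once-per-century drought.
# All parameters are invented for illustration.

def simulate(years, growth_rate=0.005, drought_interval=100,
             drought_survival=0.10):
    """Return population size after `years`, starting from 1000."""
    pop = 1000.0
    for year in range(1, years + 1):
        pop *= 1 + growth_rate          # good-year compounding
        if year % drought_interval == 0:
            pop *= drought_survival     # 90% die-off each century
    return pop

# 0.5%/year compounds to only ~1.65x per century -- far short of the
# 10x needed to offset a 90% die-off, so the population collapses
# toward extinction within a few centuries.
print(simulate(1000))
```

Raising the good-year growth rate to hunter-forager-implausible levels (a few percent per year) is the only way the numbers work out, which is the article's point: slow reproducers cannot rely on rebounding from catastrophes.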
Humans can’t depend purely on genetic selection to adapt us to droughts, floods, volcanic eruptions, hard winters, prey dieoffs, and other dramatic, short-term crises: each of us, as individuals, must change our behavior to survive the situation.
Adaptability: Why Intelligence And Omnivory Are Both Important
I’ve made the point before that general-purpose intelligence is selected for during times of change, which basically defines the entire Pleistocene…a repeating cycle of glaciation and warming that caused sea levels to fluctuate by over 150 feet and pushed ice sheets down to what is now southern Illinois. The combination of tool-using and general-purpose intelligence has allowed hominids to adapt to a bewildering variety of ecological niches, from the high Arctic to the jungles of Central America to the deserts of Arabia to the endless grasslands of the Great Plains.
However, intelligence does no good without the ability to digest and metabolize a variety of food sources. If mountain lions could speak, they still couldn’t live off of bamboo or eucalyptus leaves. If wildebeest could reason and make Gravettian stone tools, they still couldn’t live off of hunted meat.
Our Ancestral Diet: Just Because We Can And Did, Doesn’t Mean We Should
This poses an interesting question: which of our dietary adaptations simply allowed us to struggle through bad times, and which are our “ancestral diet”? To choose a modern example, humans can clearly survive and reproduce on a diet of donuts, Taco Bell, and Red Bull—but we all know such a diet isn’t optimal for health or long life.
To answer it, we need to ask a further question: “How continual was this source of food during our evolutionary history?” If we only needed to eat something mildly poisonous (but not fatal) during the driest seasons of 500-year droughts, resistance to the poison might only have been weakly selected for—whereas tolerance to something we ate regularly would have been strongly selected for.
To choose one example, this is why a few thousand years of agriculture have only weakly and incompletely selected us for gluten tolerance: intolerance won’t kill you outright, and even celiac kills you very slowly.
Therefore, humans are likely to be well-adapted to dietary patterns for which we have frequent and robust evidence over a long span of evolutionary time.
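The difference between weak and strong selection over the ~500 generations since agriculture can be made concrete with a standard haploid selection model. The selection coefficients and starting frequency below are illustrative assumptions, not measured values for any real allele.

```python
# Toy haploid selection model: p is the frequency of a "tolerance"
# allele whose carriers have relative fitness (1 + s).
# Coefficients are illustrative, not empirical.

def allele_freq(p0, s, generations):
    """Allele frequency after `generations` of selection strength s."""
    p = p0
    for _ in range(generations):
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
    return p

# Very weak selection (s = 0.002, e.g. a food that harms only slowly)
# barely moves a rare allele in 500 generations, while strong selection
# (s = 0.05) carries it to near-fixation in the same span.
print(allele_freq(0.05, 0.002, 500))
print(allele_freq(0.05, 0.05, 500))
```

This is why "agriculture is 10,000 years old, so we must be adapted to grains by now" doesn't follow: adaptation speed depends on how hard the food selects against the intolerant, and slow harm selects weakly.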
The case for infrequent and weak evidence is less clear, because there are several possibilities. One, of course, is that we’ve simply not found very many such sites yet. Another, however, is that we’ve found evidence of crisis behavior: foods eaten only in extreme periods of hunger, and to which we’re not well adapted.
Try going without food for two days—if you can—and take a walk through the woods or your local park. I guarantee you’ll start wondering whether tree bark is edible, and if you can really catch those squirrels. Anyone living in a First World nation, and reading this article on their computer, is extremely unlikely to have a meaningful conception of hunger.
Then imagine what would happen if you had to fast for another day, or an entire week, and still maintain all your regular responsibilities. (Going on a retreat and sitting on a beach doesn’t count…and “juice fasts” aren’t fasting at all.) You’re going to eat anything that will fit in your mouth and doesn’t immediately kill you.
Another possibility is medicinal use: modern hunter-foragers collect a variety of plants that are never eaten, and which are only used occasionally in small quantities. And there is a final, more disturbing possibility: not every ancient group of hominids survived, and not all experiments are successful. Infrequently eaten foods could have been the last-ditch survival effort of a tribe that starved to death and left no descendants, or a failed experiment that slowly poisoned the tribe that depended on it. Human population was small, thinly distributed, and most branches of the hominid line went extinct. There’s no way, from looking at one single archaeological site, of knowing whether the remains came from successful or unsuccessful tribes or cultures.
For instance, sorghum residue in one cave, predating any other evidence of regular seed processing by 70,000 years, could be a trace of a thriving culture of grass-eaters; it could be a temporary response to a drought or a crash in prey population; or it could be the final meals of a starving family. (“Early Homo sapiens relied on grass seeds” is, in my opinion, a transparently silly assertion to make from such limited evidence.) And as Dr. Cordain points out in his response, there’s no evidence of all the other technologies necessary to make sorghum edible to humans. (Original paper, Dr. Cordain’s response.)
Be suspicious of these types of stories in the media, for reasons I outlined in last week’s article: in addition to turning “residue in one cave, with no evidence of cooking or other necessary dietary processing” into “CAVEMEN ATE BREAD!!11!!1!”, these stories typically conflate “grains of starch” with “cereal grains”.
For some perspective on life in the wild during hard times, see the incredible National Geographic documentary “Last Feast of the Crocodiles”. It’s in four parts: here’s Part 1.
(I’ve linked you to YouTube because NatGeo has never released it on DVD, let alone released a downloadable or streamable version. If they ever make it available for sale again, I’ll gladly link to that.)
Dietary Conclusions From Archaeology: Not As Robust As We Might Hope
In order to claim that archaeological evidence represents typical human behavior, or that its remains are representative of an ancestral diet to which we are well adapted, we need robust evidence throughout the time when selection pressure was shaping hominids into anatomically modern humans, but before we spread out from Africa—approximately 2.6 Mya to 65 Kya.
Stone tools are found in profusion, all throughout the Paleolithic: first the Oldowan industry, consisting of round, easily grippable rocks with an edge smashed into one side. Then came the Acheulean biface industry, which lasted for well over a million years. Then the Mousterian and Aurignacian industries, and the microlithic technologies that allowed hunting with spears and projectiles…
…and from their earliest traces 2.6 million years ago, at Bouri and Gona, through to the present, they have been frequently associated with cutmarked animal bones, and frequently feature wear patterns consistent with skinning and butchering of game.
So the evidence for consumption of meat (and its associated fat) is robust. The evidence for eating anything else is relatively indirect, since plant matter doesn’t tend to fossilize, and we’re generally limited to inference based on things like tooth shape, jaw musculature, estimations of local weather and climate, and unambiguous evidence of controlled fire and hearths. Furthermore, evidence of any kind grows thinner the farther back we go in time: entire ancestral hominid species are implied by a few reassembled bone and skull fragments.
In conclusion: archaeology tells us that the ancestral hominid diet involved cutting meat off of bones and eating it. Beyond that, we’re a bit foggy on the details until we get into the Upper Paleolithic, where we can perform isotopic analysis of proteins, analyze plant residues, and run DNA analyses that connect the sites to modern populations of which we have historical knowledge.
And during the Upper Paleolithic, humans expanded to inhabit such a wide range of environments, from Siberia to the African rainforest, that the concept of “what Paleolithic humans ate” is dismayingly broad.
Direct Evidence: The Takeaway
- Beyond meat, we don’t know that much about what Lower and Middle Paleolithic humans really ate from day to day. Direct archaeological evidence is extremely thin until perhaps 140,000 years ago.
- The evidence in the Upper Paleolithic (40,000 years ago and newer), and from the few remaining Neolithic hunter-foragers, doesn’t point to a single “paleo diet”: it only allows us to speak of “a paleo diet”.
- It’s impossible to recreate historical paleo diets anyway, because most of the meat animals are extinct, and the plants commonly available to us have all been domesticated.
Fun fact: Herd animals were domesticated long before cabbage was bred into its modern forms. Therefore, humans have been drinking milk for longer than we’ve been eating broccoli!
In short, it’s clear that the concept of “paleo re-enactment” has just been triangle-choked into unconsciousness.
Why Call It Paleo, Then?
We call it “paleo” for the same reason that we call it “Latin”, even though we have absolutely no idea how it was spoken. Just as Latin scholars attempt to maintain something syntactically analogous to written Latin, paleo dieters attempt to maintain something nutritionally analogous to an ancestral human diet.
This is where we have to start using science to draw tentative conclusions from the evidence we have. And while it’s tempting to get into speculative arguments about human prehistory, at the end of the day, we have to ask ourselves: What is the biochemistry of humans? How does human metabolism work, today, right now?
So it is absolutely valid to question whether strictly Neolithic foods, such as butter or rice, have a place in the paleo diet. Eating butter because it’s nutritionally similar to animal fat is no different than wearing clothes you bought at the store because they’re functionally similar to animal skins.
That is why the paleo community is asking all these questions about clearly Neolithic foods. Should we eat butter or cream? Should we eat white potatoes or white rice? And do snow peas really count as legumes?
This Is Not N=1/“Whatever Works For You”
There is an important difference between “We don’t know all the answers yet” and “Do what feels right, man.” These questions have answers, because humans have biochemistry, and we should do our best to find them and live by the results. Oreos are delicious, but there’s no contingency by which they’re even remotely paleo.
Wrapping It Up: Is There A Definition Of The Paleo Diet?
Here is my best attempt at a definition. If you can improve it or think of a better one, leave a comment!
A paleo diet is:
- Eating foods that best support the biochemistry of human animals with a multi-million year history of hunting and foraging, primarily on the African savanna.
- Avoiding foods, such as grains, grain oils, and refined sweeteners, that actively disrupt the biochemistry of these human animals.
I call this approach “functional paleo”.
Live in freedom, live in beauty.
Postscript: As several commenters have noted, there’s a useful intermediate step that involves only the second part of my definition, “avoiding foods … that actively disrupt the biochemistry of these animals”. Usually this means going gluten-free, sugar- and HFCS-free, and eliminating heavily processed snack foods, i.e. Sean Croxton’s JERF. Often it’s softened to “minimizing foods …”: an example would be the WAPF, which advocates sprouting grains and beans to reduce their toxin and antinutrient load.