My series-in-progress, Why Are We Hungry? (part II, part III), will return after the Ancestral Health Symposium. I’m anxious to continue, because we’re just starting to dig into the meat of the problem with Part III, but due to conference preparations and unexpected workspace issues, I simply don’t have the time to do it justice right now.
I’ll be signing copies of The Gnoll Credo at the Friday evening author event—so if you’re attending AHS, make sure to stop by and introduce yourself!
Also, if you don’t see me update next Tuesday, rest assured I’m on a mountain somewhere in the Eastern Sierra.
For reasons I’ve explained at length in What Is The Paleo Diet, Anyway?, I believe the term “paleo” is sufficient to encompass the entire online and print community. The foundation of a paleo diet is our multi-million year evolutionary history as hunters and foragers, and we all understand that humans are poorly adapted to a diet of grass seeds—which we’ve been eating for perhaps a few thousand years.
However, though we as individuals can choose any point on the continuum, the print and online literature divides itself reasonably cleanly into two schools of thought. The traditionalists emphasize re-enactment of their perception of Paleolithic foods, make very specific claims about Paleolithic nutritional composition, and stress avoidance of all foods they view as Neolithic. In contrast, the new school claims that re-enactment is impossible, many claims of Paleolithic nutritional composition are either unsupported or implausible, and that we must evaluate foods, even clearly Neolithic foods, on their nutritional merits to present-day humans—though within our evolutionary context.
I believe we need a simple, descriptive term that distinguishes the pro-fat, dairy- and potato-tolerant “new school” of Paleo from the lean-meats-nuts-and-veggies traditionalists, without being pejorative to either.
To that end, I propose the term “functional paleo” to describe the new school. It means:

- Eating foods that best support the biochemistry of human animals with a multi-million year history of hunting and foraging, primarily on the African savanna.
- Avoiding foods, such as grains, grain oils, and refined sweeteners, that actively disrupt the biochemistry of these human animals.
In other words, “functional paleo” is based on the biochemical function of food within the human body. It is informed by evolutionary context, but not limited by it. (Or, most likely, a contested interpretation of it.)
This functional definition carries its own risk: we can mistakenly see “food” as a collection of nutrients, an approach that ignores the many constituents of Real Food (meat, eggs, vegetables, root starches, fruit and nuts) that haven’t yet been isolated, recognized, or classified as nutritional. That way lies “meal replacement shakes” and madness—and that is why we must keep our evolutionary context in sight.
Questions like “What would Grok do?” and “Imagine yourself in the woods, or by the ocean or on some fertile plain, with nothing but your own wit. What would you be able to eat?” are mental shortcuts to evolutionary context.
However, the functional definition allows us to avoid silly arguments like “Paleolithic humans regularly ate rotten meat, so why don’t you?” and “An archaeologist found starch residue in one cave in Africa, so that means cavemen ate bread and grains are paleo.” It also allows us to understand that though nuts and honey were certainly consumed in the Paleolithic, that fact alone doesn’t make them healthy to eat—especially in large quantities. I find this to be a worthy tradeoff, and I hope others agree.
Functional Paleo: Who’s In?
Here is a non-exhaustive list of sources I consider to be “functional paleo”. Please let me know if I’ve missed or miscategorized you, or anyone else: I’ve erred on the side of caution by not mentioning any source I’m not reasonably sure of. (Leave a comment, or contact me directly.)
Remember back when “Are white potatoes paleo?” was the biggest question facing the paleo community? Now we’re seeing perfectly respectable paleo bloggers advocating butter and heavy cream…and some are even experimenting with white rice.
What sort of caveman diet is that?
And just what is the “paleo diet”, anyway? Is the term becoming diluted because we just can’t stop eating delicious cheat foods—or is it still a valid concept?
First, we need to define the paleo diet. Here’s one attempt, which I’ve chosen because it’s typical:
“With readily available modern foods, The Paleo Diet mimics the types of foods every single person on the planet ate prior to the Agricultural Revolution (a mere 500 generations ago).” -Dr. Loren Cordain, “The Paleo Diet”
They probably tasted delicious, too: otherwise we wouldn't have hunted them to extinction.
Unfortunately, this definition is also misleading. “Every single person on the planet” didn’t eat the same diet during the more recent times (Upper Paleolithic, 10,000-40,000 years ago) for which we have good evidence—and during earlier times (Lower/Middle Paleolithic, 2.6 million-140,000 years ago), the direct evidence is too sparse to make specific recommendations.
Clearly we must do better.
Surviving Lean Times Is What Defines Us
The ability to survive lean times is what defines a long-lived, slowly-reproducing species like humans. It doesn’t matter how successful we are during good times if bad times kill everyone off.
Even if we thrived during “normal” years, but 90% of us died off during each 100-year drought, we would have gone extinct long ago—because hunter-foragers don’t reproduce quickly enough to grow their population ten-fold in 100 years. Even a terrible disaster that happens once every thousand years is basically a continual condition on the scale of millions of years of hominid evolution.
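The arithmetic behind that claim is easy to check. This short sketch (plain Python; the 0.5%-per-year growth rate and the starting population are hypothetical illustrations, not measured figures) computes the annual growth rate needed to rebound ten-fold in a century, then simulates what happens to a population that grows more slowly than that between periodic 90% die-offs:

```python
def required_annual_growth(recovery_factor=10, years=100):
    """Annual growth rate needed to multiply a population
    `recovery_factor`-fold in `years` years."""
    return recovery_factor ** (1 / years) - 1


def population_after(cycles, annual_growth, dieoff=0.90, start=10_000):
    """Population after repeated 100-year cycles: a century of
    steady growth, then a drought that kills `dieoff` of everyone."""
    pop = start
    for _ in range(cycles):
        pop *= (1 + annual_growth) ** 100  # a century of good years
        pop *= (1 - dieoff)                # the drought
    return pop


print(f"growth needed to rebound 10x per century: "
      f"{required_annual_growth():.2%}")   # ~2.33% per year

# At a (hypothetical) 0.5%/yr -- generous for nomadic foragers --
# the population collapses within a few such cycles:
for c in (1, 3, 5):
    print(c, "cycles:", round(population_after(c, 0.005)))
```

A sustained 2.33% per year is far beyond what a band limited to roughly one child per mother every four years can manage, so each drought cycle leaves the population smaller than the last, and extinction follows within a handful of cycles.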
Nomadic hunter-foragers are hard-pressed to have more than one child every four years or so. Lacking blenders and convenient jars of pablum, hunter-forager children are typically breastfed for years; it’s very difficult to care for a second child until the first is big enough to keep up with the tribe on foot; and it takes a long time for the mother to build her nutritional reserves back up.
In contrast, one mosquito can lay thousands of eggs in just a few weeks, and bacterial generations are often measured in minutes.
Humans can’t depend purely on genetic selection to adapt us to droughts, floods, volcanic eruptions, hard winters, prey dieoffs, and other dramatic, short-term crises: each of us, as individuals, must change our behavior to survive the situation.
Adaptability: Why Intelligence And Omnivory Are Both Important
However, intelligence does no good without the ability to digest and metabolize a variety of food sources. If mountain lions could speak, they still couldn’t live off of bamboo or eucalyptus leaves. If wildebeest could reason and make Gravettian stone tools, they still couldn’t live off of hunted meat.
Our Ancestral Diet: Just Because We Can And Did, Doesn’t Mean We Should
This poses an interesting question: which of our dietary adaptations simply allowed us to struggle through bad times, and which are our “ancestral diet”? To choose a modern example, humans can clearly survive and reproduce on a diet of donuts, Taco Bell, and Red Bull—but we all know such a diet isn’t optimal for health or long life.
To answer this, we need to ask a question: “How continual was this source of food during our evolutionary history?” If we only needed to eat something mildly poisonous (but not fatal) during the driest seasons of 500-year droughts, resistance to the poison might only have been weakly selected for—whereas tolerance to something we ate regularly would have been strongly selected for.
To choose one example, this is why a few thousand years of agriculture have only weakly and incompletely selected us for gluten tolerance: intolerance won’t kill you outright, and even celiac kills you very slowly.
Therefore, humans are likely to be well-adapted to dietary patterns for which we have frequent and robust evidence over a long span of evolutionary time.
The case for infrequent and weak evidence is less clear, because there are several possibilities. One, of course, is that we’ve simply not found very many such sites yet. Another, however, is that we’ve found evidence of crisis behavior: foods eaten only in extreme periods of hunger, and to which we’re not well adapted.
Try going without food for two days—if you can—and take a walk through the woods or your local park. I guarantee you’ll start wondering whether tree bark is edible, and if you can really catch those squirrels. Anyone living in a First World nation, and reading this article on their computer, is extremely unlikely to have a meaningful conception of hunger.
Then imagine what would happen if you had to fast for another day, or an entire week, and still maintain all your regular responsibilities. (Going on a retreat and sitting on a beach doesn’t count…and “juice fasts” aren’t fasting at all.) You’re going to eat anything that will fit in your mouth and doesn’t immediately kill you.
Another possibility is medicinal use: modern hunter-foragers collect a variety of plants that are never eaten, and which are only used occasionally in small quantities. And there is a final, more disturbing possibility: not every ancient group of hominids survived, and not all experiments are successful. Infrequently eaten foods could have been the last-ditch survival effort of a tribe that starved to death and left no descendants, or a failed experiment that slowly poisoned the tribe that depended on it. Human population was small, thinly distributed, and most branches of the hominid line went extinct. There’s no way, from looking at one single archaeological site, of knowing whether the remains came from successful or unsuccessful tribes or cultures.
For instance, sorghum residue in one cave, dating 70,000 years earlier than any other evidence of regular seed processing, could be a trace of a thriving culture of grass-eaters; it could be a temporary response to a drought or a crash in prey population; or it could be the final meals of a starving family. (“Early Homo sapiens relied on grass seeds” is, in my opinion, a transparently silly assertion to make from such limited evidence.) And as Dr. Cordain points out in his response, there’s no evidence of all the other technologies necessary to make sorghum edible to humans. (Original paper, Dr. Cordain’s response.)
For some perspective on life in the wild during hard times, see the incredible National Geographic documentary “Last Feast of the Crocodiles”. It’s in four parts: here’s Part 1.
(I’ve linked you to YouTube because NatGeo has never released it on DVD, let alone released a downloadable or streamable version. If they ever make it available for sale again, I’ll be glad to link to that.)
Dietary Conclusions From Archaeology: Not As Robust As We Might Hope
In order to claim that archaeological evidence represents typical human behavior, or that its remains are representative of an ancestral diet to which we are well adapted, we need robust evidence throughout the time when selection pressure was shaping hominids into anatomically modern humans, but before we spread out from Africa—approximately 2.6 Mya to 65 Kya.
Stone tools are found in profusion, all throughout the Paleolithic: first the Oldowan industry, which consisted of round, easily grippable rocks with an edge smashed into one side. Then came the Acheulean biface industry, which lasted for well over a million years. Then the Mousterian and Aurignacian industries, and the microlithic technologies that allowed hunting with spears and projectiles…
…and from their earliest traces 2.6 million years ago, at Bouri and Gona, through to the present, they have been frequently associated with cutmarked animal bones, and frequently feature wear patterns consistent with skinning and butchering of game.
So the evidence for consumption of meat (and its associated fat) is robust. The evidence for eating anything else is relatively indirect, since plant matter doesn’t tend to fossilize, and we’re generally limited to inference based on things like tooth shape, jaw musculature, estimations of local weather and climate, and unambiguous evidence of controlled fire and hearths. Furthermore, evidence of any kind is extremely thin the farther we go back in time: entire ancestral hominid species are implied by a few reassembled bone and skull fragments.
In conclusion: archaeology tells us that the ancestral hominid diet involved cutting meat off of bones and eating it. Beyond that, we’re a bit foggy on the details until we get into the Upper Paleolithic, where we can perform isotopic analysis of proteins, analyze plant residues, and run DNA analyses that connect the sites to modern populations of which we have historical knowledge.
And during the Upper Paleolithic, humans expanded to inhabit such a wide range of environments, from Siberia to the African rainforest, that the concept of “what Paleolithic humans ate” is dismayingly broad.
Direct Evidence: The Takeaway
Beyond meat, we don’t know that much about what Lower and Middle Paleolithic humans really ate from day to day. Direct archaeological evidence is extremely thin until perhaps 140,000 years ago.
The evidence in the Upper Paleolithic (40,000 years ago and newer), and from the few remaining Neolithic hunter-foragers, doesn’t point to a single “paleo diet”: it only allows us to speak of “a paleo diet”.
It’s impossible to recreate historical paleo diets anyway, because most of the meat animals are extinct, and the plants commonly available to us have all been domesticated.
Fun fact: Herd animals were domesticated long before cabbage was bred into its modern forms. Therefore, humans have been drinking milk for longer than we’ve been eating broccoli!
In short, it’s clear that the concept of “paleo re-enactment” has just been triangle-choked into unconsciousness.
Why Call It Paleo, Then?
We call it “paleo” for the same reason that we call it “Latin”, even though we have absolutely no idea how it was spoken. Just as Latin scholars attempt to maintain something syntactically analogous to written Latin, paleo dieters attempt to maintain something nutritionally analogous to an ancestral human diet.
This is where we have to start using science to draw tentative conclusions from the evidence we have. And while it’s tempting to get into speculative arguments about human prehistory, at the end of the day, we have to ask ourselves: What is the biochemistry of humans? How does human metabolism work, today, right now?
So it is absolutely valid to question whether strictly Neolithic foods, such as butter or rice, have a place in the paleo diet. Eating butter because it’s nutritionally similar to animal fat is no different than wearing clothes you bought at the store because they’re functionally similar to animal skins.
That is why the paleo community is asking all these questions about clearly Neolithic foods. Should we eat butter or cream? Should we eat white potatoes or white rice? And do snow peas really count as legumes?
This Is Not N=1/”Whatever Works For You”
There is an important difference between “We don’t know all the answers yet” and “Do what feels right, man.” These questions have answers, because humans have biochemistry, and we should do our best to find them and live by the results. Oreos are delicious, but there’s no contingency by which they’re even remotely paleo.
Wrapping It Up: Is There A Definition Of The Paleo Diet?
Here is my best attempt at a definition. If you can improve it or think of a better one, leave a comment!
Postscript: As several commenters have noted, there’s a useful intermediate step that involves only the second part of my definition, “avoiding foods … that actively disrupt the biochemistry of these animals”. Usually this means going gluten-free, sugar and HFCS-free, and eliminating heavily processed snack foods, i.e. Sean Croxton’s JERF. Often it’s softened to “minimizing foods …”: an example would be the WAPF, which advocates sprouting grains and beans to reduce their toxin and antinutrient load.