- 1. The Brain Got Bigger
- 2. The Megafaunal Extinction
- 3. Fat Adaptation
- 4. Insulin Resistance
- 5. Energy Efficiency
- 6. Stomach Acidity
- 7. Omega 3
- 8. Shrinking Gut
- 9. A Smaller Jaw
- 10. Lack Of Dental Decay
- 11. Isotope Data
- 12. Hunting Tools
- 13. Plants Weren’t That Great Back Then
Why Does It Matter What Our Ancestors Ate?
In the fiery debate of plant- vs animal-based nutrition, diets have been desperately staking their claims on the evolutionary version of events. You’d be hard-pressed to find a consensus on the contents of the ancestral diet. Some point to a Garden of Eden version, where early humans were practically vegans. Others argue we were almost exclusively meat-eaters. Those who like to sit on the balanced-diet fence make the case for general omnivory, the jack of all macros.
But does it really matter what our primal ancestors ate, or are we just fighting over an idea of heritage?
The hunter-gatherer (HG) diet, as it’s known, carries tremendous implications. Whereas modern culture revolves around everyone getting their own Cinderella story, the HG diet makes the case that there’s a species-appropriate template out there. We’re not clones, but neither are we unique enough to warrant so much polarity.
The fact is, palaeolithic cuisine would have occupied a vast swathe of the human timeline. If you look back at our lineage, Homo sapiens first appeared around 200,000 years ago. The agricultural revolution came into the picture around 10,000 BC, radically changing the diet by bringing grains into the mix.
That alone shows where we would have gained most of our adaptations. Except we can trace the ancestral line even further back, to around 2 million years ago, when Homo erectus began their existence on the African savannah.
This condenses the post-agricultural era to a mere blip on the human timeline. A significant blip, with a ton of seismic events, but not a span of time that could undo the adaptations we gleaned from a couple of million years as hunter-gatherers.
The thing is, evolution is an incredibly slow process. That’s why we’re stuck with many defunct survival mechanisms from the Palaeolithic Age, such as the sensitivity to fight-or-flight stimuli. It may seem inconceivable that the brain hasn’t corrected itself over ten thousand years, but that’s the reality of evolution.
So there’s no reason the superfoods of the Pleistocene can’t be the superfoods of today. HG tribes still exist, scattered across isolated corners of the globe. They offer a fascinating insight into the original human diet, a little time capsule into our ancestral past.
But even they can be misleading, because fewer than a thousand such tribes remain, and they better reflect the closing stages of the Pleistocene than its prime. We’ll get into the reasons soon enough.
If we could identify the foods we ate, plants or meat, over the pre-agricultural timeline, then we could make a real case for a species-appropriate diet. Foods we are built, evolved, to thrive on. The lingering HG tribes far outperform the health metrics of the standard office-dwelling westerner.
Don’t let stats like life expectancy fool you. Once you account for infant mortality, a figure modern medicine has slashed, the numbers are pretty even. Except the HG tribes don’t have chronic disease; they have great teeth, no strange body odours, and outstanding mental wellbeing. And they aren’t even the best example of the early human diet. So if our ancestors were sticking to certain foods, we might need to take notice.
Our Old Vegan Cousins
And with that, I’ll lay down 13 facts that outline why meat was always first on the menu. Not just any meat, but the fattest cuts. The exact ingredient we’ve been schooled for decades to avoid, as a matter of life and death.
On a side note, for a question that may have darted across your mind: why draw the line at the dawn of Homo erectus? That species evolved from Australopithecus, who were mostly on a plant-based diet, with a couple of lizards thrown in every now and then.
Australopithecus stretches the timeline back further, to around 3.6 million years ago. But they weren’t members of the Homo genus, whose records start with Homo erectus. And more importantly, Australopithecus weren’t world-beaters. They were a collection of primates a long way down the food chain, with smaller brains than ours and limited tool use. They simply didn’t have the adaptations that the Homo genus did, which enabled the latter to reach the top of the ladder and wipe out most of the other carnivores.
It’s worth adding that one member of the Australopithecus line lived alongside the Homo genus. Paranthropus robustus existed from 2 million to 0.6 million years ago, and was characterised by huge jaws tailor-made for chomping down on tough, fibrous plants. Isotope data also supports the idea that our Paranthropus cousins were vegans.
Well, they went extinct, and the line fizzled to an end, while our ancestors went from strength to strength. So for these reasons, we might as well start the line of investigation with Homo, spanning from the rise of Homo erectus around 2 million years ago to the agricultural revolution around 10,000 BC.
1. The Brain Got Bigger
In relation to body size, a large brain in primates and humans is closely associated with a nutritionally dense diet. It doesn’t get denser than fat, and while there are some natural plant fats out there, they wouldn’t exactly have been easy to find on the African savannah, or the Eurasian steppe for that matter.
A larger brain would have facilitated cognitive adaptations, such as the increased use of tools for hunting. In fact, our brain size actually peaked during the Pleistocene, declining by its closing stages. But what could have possibly happened?
2. The Megafaunal Extinction
By all accounts, the first foray into plant domestication wouldn’t have been a fun experience. It might sound safer to pitch a tent and produce crops, a food source that doesn’t run away when approached. But it’s a lot of back-breaking work, with unsavoury health outcomes. And life would have become a whole lot more boring.
So why did our ancestors ditch the nomadic life? Well, we brought it on ourselves by overhunting our primary food source. Woolly mammoths, giant ground sloths, woolly rhinoceroses, and many more creatures of myth roamed the lands back then. Unfortunately, it doesn’t get more energy-dense than a fat-laden mammoth, so their luck ran out when humans moved onto the scene. Humans might have been a lot smaller than their prey, but they made up for it with sharp pointy objects, and by hunting in packs. In reality, it was a mismatch the megafauna couldn’t survive.
As we jumped from continent to continent, the megafauna died out in sync, heavily implying that we played a far bigger role in their extinction than climate change. The megafauna that did survive, like standard elephants, had their range drastically reduced. Humans couldn’t rely on big clumsy dinners anymore, and they had to drop down to smaller prey, like deer. Prey that was gamier, with less fat. It became hard to sustain the same way of life without starving to death.
And so, the agricultural revolution came into view. It made our health ten times worse, but we didn’t have much of a choice at that point.
3. Fat Adaptation
One thing that defined the megafauna was humongous fat reserves. A woolly mammoth could have carried around 50% body fat. Given that fat has the densest concentration of energy among the macros, it makes sense that humans would have prized fat above all other resources.
But it goes much deeper than that, and the topic of fat adaptation probably warrants its own article. The reason we struggled with hunting smaller, leaner prey is that protein absorption tends to cap off at around 35% of daily calories. This is because the liver and kidneys are limited in their ability to remove the toxic nitrogen produced during protein metabolism.
All this meant our ancestors simply couldn’t get much energy from lean game. This rule of hunting is reflected by explorers on the American frontier, who suffered intense side effects, known as rabbit starvation, when dining on rabbit. And those rabbits wouldn’t have been the plump domesticated kind.
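To put the protein ceiling into rough numbers, here’s a back-of-envelope sketch. The 2,500 kcal daily budget and the 20% protein content for lean game are illustrative assumptions, not figures from the text; only the ~35% cap comes from above.

```python
# Back-of-envelope sketch of the ~35% protein ceiling described above.
DAILY_KCAL = 2500           # assumed daily energy budget
PROTEIN_CAP = 0.35          # max share of calories usable from protein (from text)
KCAL_PER_G_PROTEIN = 4      # standard Atwater factor

max_protein_kcal = DAILY_KCAL * PROTEIN_CAP            # 875 kcal
max_protein_g = max_protein_kcal / KCAL_PER_G_PROTEIN  # ~219 g of protein

# Assume lean game is roughly 20% protein by weight with very little fat:
# beyond about 1.1 kg of lean meat a day, extra protein yields no usable energy.
lean_meat_g = max_protein_g / 0.20
print(round(max_protein_g), round(lean_meat_g))  # 219 1094
```

On these assumptions, a hunter could process only around a kilogram of lean meat per day, which still leaves a large calorie shortfall unless fat fills the gap.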
Ketosis is a survival mechanism. In evolutionary terms, it had an incredibly practical use that ensured it was triggered frequently. It increases energy efficiency, controls blood glucose, and supplies the body with fuel from its own fat stores.
It’s a mechanism to enhance fasting, and it would have enabled wandering tribes to survive for days and weeks without food. Which perfectly sets us up for the reality of hunting large prey. Sometimes, you spot a hippo in the distance, you win. Other times, you miss out, and you have to move on in search of another opportunity in fresh pastures.
Ketones themselves are essentially by-products of fat breakdown, produced as the liver converts fatty acids into fuel, a result of the body evolving specifically to get the most out of its fat stores. You’re not going to find any similar mechanisms with glucose. But what you can get instead is insulin resistance.
4. Insulin Resistance
Isn’t it a bad thing? If you caught my series on chronic disease, you would have seen pathological insulin resistance playing the role of one of the four horsemen of the metabolic epidemic. And in that situation, where the body is struggling to clear blood glucose, it’s certainly a problem. But we’re not looking at insulin resistance in its guise of diet dysfunction. Instead, it’s the evolutionary mechanism of physiological insulin resistance, which occurs in perfectly healthy specimens.
In this scenario, the body blocks the absorption of glucose by the peripheral tissues, like the muscles, in order to save it for the tissues that actually need it, the erythrocytes, central nervous system, and testes. Physiological insulin resistance exists to spare a resource that doesn’t arrive in huge quantities.
This is why a few bites of candy can produce a huge crash after spending time in ketosis. By all accounts, we weren’t built to function on glucose as a primary fuel source.
5. Energy Efficiency
This is a practical look at putting bread on the table. In the stone age, that need surpassed everything else. Survival meant getting enough food to last the tribe through the week. Back before we came up with factory farming, acquiring food was a slow and methodical process. To survive, our ancestors had to choose the resource that netted the most calories in the shortest span of time.
That doesn’t mesh well with the prospect of fumbling through the undergrowth to pick up a handful of berries. Foraging for plants yielded around 1000 calories per hour. Whereas hunting provided at least 10,000. Even with all those hours spent tracking and failing.
A factor of 10 makes this a non-contest. Hunting was a far more fruitful endeavour, and if our ancestors wanted to increase their odds of survival, they’d go chasing mammoths.
One medium-sized woolly mammoth could feed a tribe of 50 humans for several months. And yes, they did it without freezers.
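The hourly yields above make the comparison easy to sketch. Note the mammoth numbers here (edible mass, energy density, tribe size, daily needs) are all illustrative assumptions layered on top of the article’s figures, so treat the result as an order-of-magnitude estimate only.

```python
# Rough comparison of foraging vs hunting yields, using the figures above.
FORAGE_KCAL_PER_HOUR = 1_000
HUNT_KCAL_PER_HOUR = 10_000
print(HUNT_KCAL_PER_HOUR // FORAGE_KCAL_PER_HOUR)  # 10x return on time spent

# How long one mammoth could feed a tribe, on assumed numbers:
mammoth_edible_kg = 2_000       # assumed edible yield from a medium mammoth
kcal_per_kg = 3_500             # assumed average energy density of fatty meat
tribe_size = 50
kcal_per_person_day = 2_500     # assumed daily need per person

days = mammoth_edible_kg * kcal_per_kg / (tribe_size * kcal_per_person_day)
print(round(days))              # 56 days, roughly two months
```

With a larger animal or a fattier carcass, the several-month figure above becomes plausible; either way, one kill dwarfs weeks of foraging.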
6. Stomach Acidity
In place of freezers, we had furnaces for stomachs, capable of cooking up a ton of acid. The more acidic a stomach is, the better it is at breaking down meat. And on that count, humans have more acidic stomachs than many carnivores. Omnivores come in at an average pH of 2.9, carnivores at 2.2, while we average 1.5. That puts us a level higher, with the scavengers, creatures that are perfectly fine with tearing at rotting carcasses.
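Those pH gaps are bigger than they look, because pH is a base-10 logarithmic scale: each full unit lower means ten times the hydrogen-ion concentration. A quick sketch using the averages above:

```python
# pH is a base-10 log scale: a drop of 1 unit = 10x the H+ concentration.
def relative_acidity(ph_a: float, ph_b: float) -> float:
    """How many times more acidic ph_a is than ph_b."""
    return 10 ** (ph_b - ph_a)

# Human stomach (pH 1.5) vs the quoted carnivore and omnivore averages:
print(round(relative_acidity(1.5, 2.2), 1))  # ~5x a typical carnivore
print(round(relative_acidity(1.5, 2.9), 1))  # ~25x a typical omnivore
```

So a 0.7-unit gap on the scale translates to roughly fivefold stronger acid, which is what makes the scavenger comparison meaningful.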
Our ancestors were scavengers before becoming hunters. We ate the meat that the big predators either left because they were full, or simply couldn’t reach. But by the time we could get our hands on the leftovers, the carcass would be getting past its expiry date. Leftovers would also be tough and hard to digest. High stomach acidity was a mechanism that enabled us to break it all down without too much fuss.
7. Omega 3
DHA, an omega 3 fatty acid, plays a critical role in the structure and makeup of the human brain. It was part of what enabled the brain to increase in size during the Pleistocene. This is a problem for plant advocates, because the conversion of ALA (the plant form of omega 3) to DHA is less than 5%. It’s extremely inefficient, making plants a terrible source of DHA.
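To see what that conversion rate implies in practice, here’s a minimal sketch. The 250 mg/day DHA target is an assumed illustrative figure; the article itself only states the under-5% conversion ceiling.

```python
# Sketch of why ALA-to-DHA conversion makes plants a poor DHA source.
dha_target_mg = 250      # assumed daily DHA intake target (illustrative)
conversion_rate = 0.05   # upper bound on ALA -> DHA conversion (from text)

# At best, you'd need 20x the target in ALA to hit it via conversion alone.
ala_needed_mg = dha_target_mg / conversion_rate
print(ala_needed_mg)     # 5000.0 mg of ALA per day, as a best case
```

And since real-world conversion sits below that 5% ceiling, the ALA requirement only grows from there.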
Some use this to support the Shoreline or Aquatic Ape Hypothesis, where it was wet foods like clams and oysters that drove brain growth. While Omega 3 is certainly depleted in modern terrestrial animals, it would have been greater back in the Pleistocene. It’s certainly conceivable that we would have had plenty of DHA to go around with the megafauna. Survival didn’t mean hugging the rivers and beaches.
8. Shrinking Gut
The brain evolved to become an energy hog, taking up 20% of our calorie needs. In order to fuel such an intensive organ, the body had to reevaluate its budget, and the digestive tract took the fall. While the small intestine actually increased in size, the colon was drastically reduced, cutting down on the overall gut space.
The small intestine is critical for breaking down proteins and fats, and a longer one is characteristic of carnivores. The colon, by contrast, ferments fibre. A gorilla can glean 60% of its energy from fibre, while humans get less than 4%. We lost any real ability to digest cellulose, the dominant structural compound in plants. A smaller gut is primed for energy-dense foods that take up less space.
Much like ribeyes, or the mammoth equivalent.
This also meant we spent far less time chomping down on food, about 5% of daily activity. While chimpanzees, our contemporary plant-based cousins, spend 48%. A change that’s visibly apparent in the gnashers.
9. A Smaller Jaw
It might be time to bring our former plant-based cousins back into this: the extinct Paranthropus robustus. They had a set of humongous jaws that put ours to shame. Paranthropus in turn descended from Australopithecus, the precursors to the Homo line, who also sported a large set of molars. We, meanwhile, shifted towards more meat consumption, foods that were far softer on the palate than pre-fire fibre. With less chewing came smaller jaws.
10. Lack Of Dental Decay
Teeth are built to last, and in fact they preserve remarkably well after death, making them a reliable record. By all accounts, the sugars in plants are responsible for dental caries, and that’s reflected by caries being a rarity in fossilised human remains until around 13,000-15,000 years ago.
That’s right around when humans began to shift into agricultural cuisine.
11. Isotope Data
Nitrogen isotopes can be found in fossilised human collagen residues, and measuring them provides the most extensively used method for estimating human trophic level. HTL, an organism’s position in the food chain, is essentially the matter being debated in this article. A higher HTL puts the bearer closer to the top of the chain. Plants take up nitrogen from the soil, cows eat the plants, we eat the cow, and so on, with the heavier isotope concentrating at each step.
Isotope data from early human remains show ranges that put us squarely in the domain of carnivory. Neanderthals, if we can make a leap across to our old rivals, were high-level carnivores. Humans show similar numbers, even pushing into the Mesolithic, the intermediary period that followed the Pleistocene epoch, around 15,000 to 5,000 years ago. That’s a time when agriculture would have already been in the frame in heavily populated regions.
12. Hunting Tools
You’d be hard-pressed to find much archaeological evidence of primal foraging equipment, but there’s plenty of proof of the innovations we made on the hunting front. And it dates back before we had the fire needed to break down tough fibrous plants, even before we had the sharp pointy objects to bring down megafauna. The first evidence of tool use, dating back 2 million years, tells a story of how humans entered the meat-eating scene: chunks of stone best suited to breaking apart bones.
The brain of a carcass was stacked with critical nutrients, including DHA, but it was protected by thick skulls that carnivores couldn’t get past. Humans used sharp flakes of rock to crack open bone and access these goldmines, along with the marrow.
Humans then steadily evolved from scavengers into hunters, showcased by hand axes and thrusting spears around 600,000 years ago. Fire itself entered the scene around 1.5 million years ago, and would have certainly added some flavour to the process. In any case, the majority of archaeological data we have of the Pleistocene points squarely at meat-eating being the dominant driver of tool innovations.
13. Plants Weren’t That Great Back Then
To wrap this up, the stone age world wasn’t the Garden of Eden paradise it’s often made out to be. You only have to look at how a shrivelled, bitter wild apple matches up against the juiced-up supermarket version. Fruits, the drug dealers of the plant world, looked nothing like they do now. Before domestication, wheat would have been found in isolated patches of wilderness, and just like apples, it was much smaller.
Many of the fruits and vegetables you’ll be familiar with today are actually hybrids of ancient plants, selectively bred to be more palatable. Early humans hadn’t got round to crossbreeding just yet.
Tubers, which paleo diets hold up as one of the original human foods, weren’t digestible until we came up with a relevant enzyme, much later along the timeline. Not to mention that before humans found fire, these foods would have been far too tough to be considered worthwhile.
In short, foraging for plants was a task of high effort and measly reward. Not just in terms of taste, but calorie yield. If plants did make the menu, it would have been predominantly as fallback foods for when meat was scarce.
Our Ancestors Were Meat Based
I’m not attempting to paint the picture that early humans were exclusive members of the carnivore club; there’s no reason we wouldn’t have dabbled in mushrooms, fruits, and even tubers. But there shouldn’t be any doubt about the foods we preferred. If a herd of mammoths was lumbering around the vicinity of the camp, it wasn’t an opportunity to pass up. Plants, at best, were fallback foods. Or, if our ancestors could find the right ingredients, recreational ones.
The evidence on early humans is pretty conclusive, that they…
- Relied on fatty meat for energy
- Evolved from scavengers to hunters
- Considered the megafauna as the food of choice
- Wiped them out
- Had to come up with a Plan B
As for that Plan B, we’re living with the results now. On one side, we strapped civilisation to a rocket and landed on the moon. On the other, our health is going from bad to worse. The agricultural revolution was, in effect, an attempt to cheat nature, and nature has an intimidating track record of getting the last laugh.
More Guides On Meat-Based Living