Amongst the multitude of diets littered across the modern nutritional landscape, carnivore stands out as one of the extremes. It’s ultra-restrictive, and it venerates the most maligned food on the market. So it would be fair to call it a little controversial, and according to many, dangerous when attempted without medical supervision.
These critics would understandably point at the scarcity of research that’s been undertaken on a diet that’s only recently risen to prominence. There aren’t any large-scale, long-term, randomised controlled trials to refer back to when pondering the merits and pitfalls of combining steak and butter.
On the other hand, there is no shortage of literature out there raising question marks over meat-fuelled LDL and clogged arteries, and highlighting the importance of getting your vitamins by eating the veggie rainbow.
At least, that’s the way the anti-carnivore argument usually goes. So let’s get into the weeds and see whether carnivore has a scientific leg to stand on, or whether the hype is fuelled entirely by anecdata.
Randomised Controlled Trials On Carnivore
While I appreciate that RCTs are seen as the gold standard for nutrition, and carnivore lacks a good RCT to back it up, it raises an obvious question: which diets actually get to boast long-term RCTs? The answer is very few. The Mediterranean diet, the DASH diet, the vegan diet, and ketogenic diets have been given the treatment. But that still leaves out the other 99% of diets, and RCTs themselves aren’t exempt from criticism.
They still produce their data from food questionnaires, which leaves them open to recollection errors or straight-up cheating. They can’t be double-blinded, meaning the dieter will be perfectly aware of what food he or she is ingesting, which will inevitably play a role in dictating the outcome.
Especially when the media is constantly screaming at us that bacon sends colon cancer risk skyrocketing, and that vegetables are the panacea for our many sins.
So, the absence of RCTs aside, what clinical backing can carnivore muster?
Carnivore Studies
A recent self-reported study of 2,029 carnivore dieters, conducted over a 6-month period, found that: “Contrary to common expectations, adults consuming a carnivore diet experienced few adverse effects and instead reported health benefits and high satisfaction.”
Notably, they recorded that LDL was elevated, while HDL and triglycerides sat in the optimal range. The authors accepted that this study was merely about collecting preliminary data for the diet, and that the results were consistent with expectations.
98% of the participants reported being satisfied or very satisfied, and 92% of those with Type 2 diabetes discontinued insulin.
For the first foray of carnivore into diet surveys, it’s not a terrible start.
There are also a few studies that aren’t explicitly carnivore, but are very close to the real thing. Such as the Paleolithic Ketogenic Diet (PKD), which is essentially ancestral keto. Much like carnivore.
Here we have PKD being used to successfully treat a child with epilepsy. Bear in mind that these are just case studies on one or two subjects, so any takeaways need a heavy helping of salt.
PKD showing positive effects as cancer therapy.
And PKD successfully managing Type 1 diabetes.
Then there’s carnivore itself, which does have a two-patient controlled trial of a meat-only diet from way back in 1929, thanks to the exploits of our favourite carnivorous author and explorer, Stefansson.
This is the man who went traipsing around the Arctic, lived with the Inuit, noted how healthy they were for people who had no chance of finding vegetables, then decided to try it out on himself (alongside a fellow explorer) upon returning to civilisation. As it turns out, the pair made it through the one-year carnivore course in perfect health, save for one stutter when they were instructed to cut down on the fats. Which shouldn’t be surprising for anyone who understands what makes carnivore tick.
In a more recent piece of research, a 10-day, 10-person trial on the microbiome placed carnivore against a vegan diet. Gut diversity ended up being similar across both diets, with some beneficial bugs going up and others going down. There’s not much to take away from this one, other than that carnivore doesn’t nuke your gut diversity, despite the notable absence of fiber.
Speaking of fiber, a small study of 63 constipated subjects found that reducing or removing fiber relieved their symptoms, often completely. This was a wild conclusion considering that fiber is sold to us as the force that keeps bowel function regular. Apparently filling your digestive tract with indigestible junk isn’t always a good thing.
Studies On Plant Toxins
If the research around carnivore is murky, wait till you try to investigate plant toxins. We know they exist, at least the ones we’ve mapped out to this point. A paper from 1990 estimated that 99.99% of the pesticides in our diet, by weight, are natural, produced by the plants themselves.
Cabbage alone has 49 that we know of. The same paper estimated that the average American eats 1.5 grams of natural pesticides every day.
Where it isn’t so clear is the effect these toxins have on humans. They can certainly be a problem for insects, but humans typically have systems better able to deal with and clear out these compounds.
Plant toxins can certainly kill you, that’s not up for debate. Just walk into a forest, and start eating berries and mushrooms at random. See how long you last. This isn’t medical advice.
Amygdalin, a plant toxin that breaks down into cyanide, is a big reason why you shouldn’t be trying wild almonds.
Plants aren’t generally fans of accepting their fate each time a predator comes knocking. They understandably prefer to survive, and in the absence of wings to take flight on, they come armed with all manner of pesticides to discourage, incapacitate, and even kill their oppressors.
The question here is whether the plants that we’ve cultivated and selectively bred across thousands of years to be edible are in fact safe for human consumption.
On one hand, we still exist as a species despite ingesting plants on a regular basis for over 10,000 years, so our continued survival implies that plants can at least be tolerated.
But while we don’t have many long-term studies to refer to, there is still strong evidence that some of the more notorious plant toxins in modern cuisine can lead to various states of metabolic dysfunction.
One study suggested that phytic acid, a toxin that binds to and prevents the absorption of key minerals, is responsible for zinc deficiencies in the developing world.
Lectins have been implicated in punching holes in the gut epithelium, leading to leaky gut. Gluten, the notorious wheat protein, is the known driver of celiac disease.
Oxalates, sharp crystals that accumulate in the body and cause inflammation, are known to lead to kidney stones when they outstrip the body’s ability to excrete them. Resveratrol, a polyphenol with much-touted anti-oxidative properties, is known to cause DNA damage.
In general, we are perfectly aware of plant toxins, and their properties. The controversy arises over what dose makes the poison. Many of these compounds are used as supplements for hormetic benefits, like the aforementioned resveratrol and the broccoli-derived sulforaphane.
Hormesis essentially means you’re training the body to become more resilient by exposing it to a low dose stressor. That’s the gist of the argument that people use in favour of plant compounds. By having your fill of broccoli supplements, you become harder to kill.
The problem is, although it could well be true to an extent, we just don’t know what dose gets us to the sweet spot. Personally, I’d rather err on the safe side and keep those toxins to a minimum, seeing as they’ve not been made with my wellbeing in mind. But I’m totally open to the possibility of certain compounds playing a positive role as hormetic stressors, when used medicinally.
The Palaeolithic Study
Finally, the greatest case that can be made in favour of carnivore, the ancestral one.
There is one long-term study that’s evaded mention in many of these diet debates: the Palaeolithic Study, conducted over several million years, beginning long before Homo sapiens existed and concluding with the agricultural revolution.
If human existence were spread out over a 24-hour clock, agriculture would have been introduced at around 23:54. The time since is a blip compared to what passed before, which heavily implies that most of our current adaptations were designed to make the most of a palaeolithic environment.
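As a quick sanity check of that clock analogy, here’s the arithmetic, with round figures assumed purely for illustration (roughly 3 million years of hominin existence and about 12,000 years since the dawn of agriculture):

```python
# Back-of-the-envelope check of the 24-hour clock analogy.
# The timespans are rounded assumptions for illustration, not precise figures.
HOMININ_YEARS = 3_000_000     # roughly when the Homo lineage emerged
AGRICULTURE_YEARS = 12_000    # roughly when the agricultural revolution began

minutes_in_day = 24 * 60
farming_minutes = minutes_in_day * AGRICULTURE_YEARS / HOMININ_YEARS
print(f"Agriculture occupies the last {farming_minutes:.1f} minutes of the day")
# -> about 5.8 minutes, i.e. agriculture arrives at roughly 23:54
```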
Evolution is the lens through which we can define optimal human nutrition, and we get there by working out what Stone Age cuisine actually looked like.
Unfortunately we didn’t have lab coats around back then to pick up and puzzle over the data, but we now have very strong evidence that our palaeolithic ancestors were, at the very least, hypercarnivores.
One paper, by Miki Ben-Dor, assessed the trophic level of humans across the palaeolithic.
The trophic level pinpoints our spot in the food chain. Grass would be perched at the bottom, whatever eats the grass comes on the next rung up (vegans), and then the creature that grabs the grass-muncher, and so on.
The first trophic level – Primary producers – Plants and algae
The second trophic level – Herbivores
The third trophic level – Carnivores and omnivores that eat herbivores
The fourth trophic level – Carnivores that eat other carnivores
The fifth trophic level – Apex predators
Modern estimates put humans low down on the second trophic level, at 2.21, sharing a bench with pigs and anchovies.
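For a sense of where a figure like 2.21 comes from: a trophic level is conventionally calculated as one plus the consumption-weighted average of the trophic levels of everything in the diet. Here’s a minimal sketch, with the dietary splits as illustrative assumptions rather than the actual data behind either estimate:

```python
# Trophic level = 1 + consumption-weighted average of the trophic levels of the diet.
# The dietary fractions below are illustrative assumptions, not real survey data.
def trophic_level(diet):
    """diet is a list of (food_trophic_level, fraction_of_intake) pairs summing to 1."""
    return 1 + sum(level * fraction for level, fraction in diet)

# A largely plant-based modern diet: ~80% plants (level 1), ~20% herbivore meat (level 2)
modern_human = trophic_level([(1, 0.80), (2, 0.20)])

# A hypercarnivore leaning heavily on large herbivores (level 2)
palaeo_hunter = trophic_level([(1, 0.10), (2, 0.90)])

print(f"modern human:  {modern_human:.2f}")   # 2.20, in the ballpark of the 2.21 estimate
print(f"palaeo hunter: {palaeo_hunter:.2f}")  # 2.90, pushing towards the third trophic level
```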
The palaeolithic painted a different picture. The study concluded that humans occupied a higher trophic level back then, drawing on eight lines of evidence.
Stomach acidity – Humans have stomach acidity stronger than carnivores, on par with scavengers.
Adipocyte morphology – Our fat stores are similar to those of carnivores, with smaller and more numerous cells than other primates.
Age at weaning – We have an early weaning age, at 2.5-2.8 years, compared to 4.5 to 7.7 years in other primates. This is once again typical of carnivores.
Stable isotopes – The best proof of trophic level, although it only goes back 50,000 years. Within that timeframe, human fossils show high carnivory until the agricultural revolution.
Behavioural adaptations – We have six social behaviours that resemble carnivores: food sharing, food storage, cannibalism, surplus killing, interspecies intolerance, and alloparenting. Only one behaviour, group defence, resembles primates.
Palaeontological evidence – Large carnivores in Europe went extinct in step with the arrival of humans. The megafauna, or large herbivores, were also wiped out as our ancestors crept across the continents.
Zoological analogy – Species that target large prey are exclusively in the camp of hypercarnivores.
Insulin resistance – Humans have a low physiological insulin sensitivity in order to preserve glucose for the cells that need it, freeing the rest to run on fat instead, much like carnivores. This plays a decisive role in letting us get the best out of ketosis.
This paper didn’t just show that our ancestors engaged in hypercarnivory on a regular basis across the palaeolithic, it also showed that we evolved adaptations that enable us to target large prey.
More Proof Of Humans Being Large Prey Specialists
Shoulder anatomy – Our shoulders have the unique ability to throw objects at high speed and with great accuracy. A very useful trait for extracurricular activities like trying to bring down a mammoth without being skewered by its tusks. Whereas our closest cousins, the chimpanzees, display terrible aim whenever they decide that it’s time to start throwing their faeces around.
Gut morphology – The human gut is notably different from that of our cousins, with the small intestine being 64% longer and the large intestine 77% shorter. Given that the large intestine is pivotal for fermenting fiber and turning it into energy, it’s not a favourable adaptation for plant specialists. The small intestine, meanwhile, is where fat and protein get absorbed.
In fact, we can only extract up to 4% of our daily energy needs from fiber. Most natural plant sources are highly fibrous, fruits included.
Brain size – Our brains are over three times larger than those of other primates, and brain size correlates with the nutrient density of the diet. Meat, especially fatty meat, is vastly more energy dense than fibrous foods, which implies that our brains got big through carnivory.
Caloric return – Even the hunting of medium-sized mammals provides at least 10 times the caloric return of gathering plants. Given that the extra brain space increased our energy requirements, that mega herbivores were in plentiful supply, and that calories were the fundamental resource for hunter-gatherers, it makes sense that we would have directed our efforts towards the business with the greatest profits.
Nutrient deficiencies in plants – Meat is far richer in micronutrients than plants. One study comparing terrestrial mammal meat with plants found that meat was several times denser in 8 out of 10 vitamins. And that was without accounting for things like bioavailability, which skews the comparison even further in favour of meat.
Then there are the critical nutrients that are either nonexistent in plants, or convert extremely poorly, which makes a plant-based diet pretty suspect.
Vitamin A – Beta-carotene (the plant variant) converts at 8% or lower, with 45% of people being low responders to the plant version.
Vitamin D – Extremely limited in plants.
Omega 3 – ALA (the plant variant) converts to EPA and DHA (the forms the body actually uses) at just 2-3%. For DHA, a critical nutrient for brain development, conversion can be as low as 0%. See the sketch after this list for what that means in practice.
Carnitine – No plant sources
Creatine – No plant sources
Carnosine – No plant sources
Taurine – No proven plant sources
Iron – As little as 2% of non-heme iron (plant variant) is absorbed, which can be further dented by plant antinutrients like tannins.
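To put those conversion rates in perspective, here’s a rough sketch of how much ALA you’d need to eat to end up with a modest amount of usable EPA and DHA. The target intake and conversion figure are illustrative assumptions, not recommendations:

```python
# How much plant-sourced ALA is needed to hit a modest EPA/DHA target,
# given the low conversion rates? All figures are illustrative assumptions.
TARGET_EPA_DHA_MG = 250      # a commonly cited daily intake figure, assumed here
ALA_CONVERSION = 0.03        # ~2-3% of ALA converts to EPA/DHA, often lower for DHA

ala_needed_g = TARGET_EPA_DHA_MG / ALA_CONVERSION / 1000
print(f"ALA required: {ala_needed_g:.1f} g per day")  # ~8.3 g of ALA
# Oily fish and other animal sources supply preformed EPA and DHA directly,
# with no conversion penalty.
```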
Seasonality of plant foods – None of this should imply that our ancestors never touched plants. They were just inferior options, and would have been extremely scarce over vast swathes of the year. Especially since there was an ice age going on. Plants are seasonal creatures, and the nuts and berries that might have tempted palaeolithic humans would have only been available for a few weeks of the year.
High fiber content – The vegetables and fruits we eat today bear no resemblance to the wild ancestors we bred them from. Six classic supermarket vegetables were all selectively bred from the same plant. A wild banana is practically inedible due to being 90% seeds and 10% flesh. Bitter apples are, well, bitter.
Given that we struggle to digest any meaningful amount of fiber, and given how unpalatable these ancestral fruits and veggies would have been compared to a juicy mammoth flank, the natural conclusion is that plants played the role of fallback foods.
Good for avoiding starvation, not so good for thriving.
Ketosis – Humans can enter ketosis more readily than any other mammal, making us highly adapted to lengthy spells of fasting while being fuelled by body fat. Ketosis, meanwhile, has many proven therapeutic and antioxidative benefits, suggesting that the body and brain are highly optimised to make the most of the fuel source.
The palaeolithic hunter’s lifestyle would have been split between devouring mammoth steak and spending potentially lengthy periods tracking down the next kill. Ketosis would have allowed the hunter to remain in prime condition for the key phase, even if it came a week after his last meal.
Body fat percentage correlates with animal size – In other words, the larger the animal, the greater its fat stores, which placed the target squarely on the backs of the megafauna. Protein is a much less efficient source of energy, and lean animals like squirrels and rabbits just didn’t have the caloric density that the large herbivores did. With their mass extinction, humans had little choice but to turn to agriculture as a way to make up the missing calories.
This is why palaeolithic humans weren’t simply hunters, they were fat hunters. And while the megafauna were mostly wiped out, we still have a few species lingering around. We just call them ruminants now.
Wrapping Up
To summarise the literature we have on palaeolithic anthropology: it’s conclusive that our ancestors were, at the very least, heavily meat-based, getting at least 70% of their calories from meat. After splitting off from the last common ancestor of chimps and humans, we gradually morphed over millions of years into specialist hunters of large herbivores.
Our brain size increased, our gut shrank, we mastered ketosis, and we wiped out the megafauna through overhunting. With the mass extinction of our primary prey, and the lack of fat reserves in smaller animals, we likely had no choice but to turn to agriculture.
The time that has elapsed since is nowhere near long enough to roll back the changes in our physiology that were ingrained over the palaeolithic. There have been a few adaptations to make plants more tolerable, like variants in the FADS genes, but they’ve only made nutrient conversion marginally better.
The argument here isn’t that we only ate meat and nothing else across the palaeolithic. Some cultures would have been in climates inhospitable to plants, while others might not have passed up the opportunity to wrestle honey away from a beehive, or to grab a handful of berries that they assumed weren’t poisonous.
What the evidence does imply is that our present existence as apex predators is a testament to our capacity to thrive on meat-fuelled ketosis, and that there is nothing in the palaeolithic data to suggest that we specifically need plants to reach optimal health.
Plants aren’t inherently problematic at moderate doses, but they can be for populations with metabolic dysfunction or leaky gut, and modern variants with zero ancestral precedent, like seed oils, should be viewed as toxic at any amount.
In addition to that, the plants we eat today do not remotely resemble the ones our ancestors might have stumbled on, whereas grass-fed beef bears a very close resemblance to ancestral meat.
The sustainability of a restrictive diet like carnivore is always offered up with a question mark, but personally, I think a trial run of a few million years should be good enough to convince you to take the leap of faith.
It’s the safest diet for the human species, with the fewest toxins (by the omission of plants), and the most nutrients (by the addition of red meat). It worked for us then, and I’ve seen nothing to suggest that it can’t work for us now. Sketchy epidemiological data isn’t enough to overturn millions of years of evolutionary history.
Want To Take On The Carnivore Transformation For Yourself?
Join my online membership where you get all the tools you need to get the best out of carnivore’s muscle-building and blubber-melting potential.
The Apex Membership Includes
- Customised carnivore or carnivore-style (with carbs) diet plans for either weight loss or weight gain.
- Dedicated training plans with video tutorials for each exercise
- Regular video check-ins with me
Get More Of My Content On My Socials