From Wikipedia, the free encyclopedia
The paleolithic diet is a modern nutritional plan based on the presumed diet of Paleolithic humans. It rests on several controversial premises, the most important of which are: first, that human genetics have scarcely changed since the dawn of agriculture, which marked the end of the Paleolithic era around 10,000 years ago; second, that modern humans are adapted to the diet or diets of the Paleolithic period; and, third, that it is possible for modern science to discern what such diets consisted of.
The Paleolithic diet consists mainly of fish, grass-fed and pasture-raised meats, eggs, vegetables, fruit, fungi, roots, and nuts, and excludes what are perceived to be agricultural products: grains, legumes, dairy products, potatoes, refined salt, refined sugar, and processed oils. Proponents recommend establishing portions that keep nutrient intake balanced.
Proponents argue that modern human populations subsisting on traditional diets, allegedly similar to those of Paleolithic hunter-gatherers, are largely free of diseases of affluence, and that Paleolithic diets in humans have shown improved health outcomes relative to other widely recommended diets.
First popularized in the mid-1970s by gastroenterologist Walter L. Voegtlin, the diet has been promoted and adapted by a number of authors and researchers in several books and academic journals. The idea is a common theme in evolutionary medicine, and Voegtlin was one of the first to suggest that following a diet similar to that of the Paleolithic era would improve a person's health. In 1975, he self-published The Stone Age Diet: Based on In-depth Studies of Human Ecology and the Diet of Man, in which he argued that humans are carnivorous animals and that the ancestral Paleolithic diet was that of a carnivore: chiefly fats and protein, with small amounts of carbohydrates. His dietary prescriptions were based on his own medical treatments of various digestive problems, namely colitis, Crohn's disease, irritable bowel syndrome, and indigestion.
In 1985, S. Boyd Eaton and Melvin Konner published a paper on Paleolithic nutrition in the New England Journal of Medicine, which attracted wider mainstream medical attention to the concept. Three years later, Eaton, Konner, and Marjorie Shostak published a book about this nutritional approach, which was based on achieving the same proportions of nutrients (fat, protein, carbohydrates, vitamins, and minerals) as were present in the diets of late Paleolithic people. It did not exclude foods that were not available before the development of agriculture. As such, this nutritional approach included skimmed milk, whole-grain bread, brown rice, and potatoes prepared without fat, on the premise that such foods supported a diet with the same macronutrient composition as the Paleolithic diet. In 1989, these authors published a second book on Paleolithic nutrition.
Starting in 1989, Staffan Lindeberg led scientific surveys of the non-westernized population on Kitava, one of the Trobriand Islands of Papua New Guinea. These surveys, collectively referred to as the Kitava Study, found that this population apparently did not suffer from diabetes, hypertension, ischemic heart disease, obesity, or strokes. Starting with the first publication in 1993, scholars with the Kitava Study have published a number of scientific works on the relationship between diet and western disease. In 2003, Lindeberg published a Swedish-language medical textbook on the subject. In 2010, this book was wholly revised, updated, translated and published for the first time in English.
Since the end of the 1990s, a number of medical doctors and nutritionists have advocated a return to a so-called Paleolithic (preagricultural) diet. Proponents of this nutritional approach have published books and created websites to promote their dietary prescriptions, synthesizing diets from modern foods that emulate the nutritional characteristics of the ancient Paleolithic diet. Some of these allow specific foods that would have been unavailable to pre-agricultural peoples, such as certain animal products (e.g., dairy), processed oils, and beverages.
The Paleolithic diet seeks to mimic the diet of preagricultural foragers; it generally corresponds to what was available in any of the ecological niches of Paleolithic humans. Based upon commonly available modern foods, it includes cultivated plants and domesticated animal meat as an alternative to the wild sources of the original pre-agricultural diet. The ancestral human diet is inferred from historical and ethnographic studies of modern-day foragers, as well as from archaeological finds, anthropological evidence, and the application of optimal foraging theory.
The Paleolithic diet consists of foods that can be fished and hunted, such as seafood and meat (including offal), and foods that can be gathered, such as eggs, fruits, herbs, insects, mushrooms, nuts, seeds, spices, and vegetables. The meats recommended are preferably free of food additives, such as wild game and grass-fed beef, since these contain higher levels of omega-3 fats than grain-fed domestic meats. Food groups that advocates claim were rarely or never consumed by humans before the Neolithic agricultural revolution are excluded from the diet, mainly dairy products, grains, legumes (e.g., beans and peanuts), processed oils, refined sugar, and salt. Many of these foods would have been available at certain times of the year, and may or may not have been consumed. Some advocates consider the use of oils with low omega-6:omega-3 ratios, such as olive oil, to be healthy and advisable.
On the Paleolithic diet, practitioners are permitted to drink mainly water, and some advocates recommend tea as a healthy drink. Eating a wide variety of plant foods is recommended to avoid high intakes of potentially harmful bioactive substances, such as goitrogens, which are present in some roots, seeds, and vegetables. Unlike raw food diets, all foods may be cooked, without restrictions. However, some Paleolithic dieters believe that humans have not adapted to cooked foods in the time since the control of fire by Homo erectus, and so they eat only foods that are both raw and Paleolithic.
According to certain proponents of the Paleolithic diet, practitioners should derive about 56–65% of their food energy from animal foods and 36–45% from plant foods. They recommend a diet high in protein (19–35% energy) and relatively low in carbohydrates (22–40% energy), with a fat intake (28–58% energy) similar to or higher than that found in Western diets.
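As a rough illustration, these energy fractions can be converted into daily gram amounts using the standard 4/4/9 kcal-per-gram factors. The 2,500 kcal total and the midpoint fractions below are assumptions chosen for the example, not figures from the article:

```python
# Convert macronutrient energy fractions into daily gram targets.
# The 2,500 kcal intake and the midpoint fractions are illustrative
# assumptions, not values stated in the article.
KCAL_PER_GRAM = {"protein": 4, "carbohydrate": 4, "fat": 9}

def macro_grams(total_kcal, fractions):
    """Return grams of each macronutrient for the given energy fractions."""
    return {m: round(total_kcal * frac / KCAL_PER_GRAM[m])
            for m, frac in fractions.items()}

# Midpoints of the cited ranges: 27% protein, 31% carbohydrate, 43% fat
targets = macro_grams(2500, {"protein": 0.27, "carbohydrate": 0.31, "fat": 0.43})
print(targets)  # {'protein': 169, 'carbohydrate': 194, 'fat': 119}
```

Note that fat carries more than twice the energy per gram of protein or carbohydrate, which is why a 43% fat share translates into fewer grams than a 27% protein share.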
Staffan Lindeberg advocates a Paleolithic diet, but does not recommend any particular proportions of plants versus meat or macronutrient ratios. According to Lindeberg, calcium supplementation may be considered when the intake of green leafy vegetables and other dietary sources of calcium is limited.
According to S. Boyd Eaton, "we are the heirs of inherited characteristics accrued over millions of years; the vast majority of our biochemistry and physiology are tuned to life conditions that existed before the advent of agriculture some 10,000 years ago. Genetically our bodies are virtually the same as they were at the end of the Paleolithic era some 20,000 years ago."
The reasoning underlying this nutritional approach is that natural selection had sufficient time to genetically adapt the metabolism and physiology of Paleolithic humans to the varying dietary conditions of that era, but that in the 10,000 years since the invention of agriculture and its consequent major change in the human diet, natural selection has had too little time to make the optimal genetic adaptations to the new diet. Physiological and metabolic maladaptations such as diabetes have been observed in Native American populations newly introduced to the contemporary Western diet.
More than 70% of the total daily energy consumed by all people in the United States comes from foods such as alcohol, cereals, dairy products, refined sugars, and refined vegetable oils. Advocates of the Paleolithic diet assert these foods contributed little or none of the energy in the typical preagricultural hominin diet and argue that excessive consumption of these novel Neolithic and industrial-era foods is responsible for the current epidemic levels of cancer, cardiovascular disease, high blood pressure, obesity, osteoporosis, and type 2 diabetes in the US and other contemporary Western populations.
Researchers have applied the evolutionary rationale of the paleolithic lifestyle to argue for high levels of physical activity in addition to dietary practices. They suggest that human genes "evolved with the expectation of requiring a certain threshold of physical activity", and that a sedentary lifestyle results in abnormal gene expression. Compared to ancestral humans, modern humans often have increased body fat and substantially less lean muscle, a risk factor for insulin resistance. Human metabolic processes evolved in the presence of physical activity–rest cycles, which regularly depleted skeletal muscles of their glycogen stores. To date, it is unclear whether these activity cycles universally included prolonged endurance activity (e.g., persistence hunting) and/or shorter, higher-intensity activity. S. Boyd Eaton estimated that ancestral humans spent one-third of their caloric intake on physical activity (1,000 cal/day out of a total intake of 3,000 cal/day), and that the paleolithic lifestyle was well approximated by the WHO-recommended physical activity level of 1.75, or 60 minutes/day of moderate-intensity exercise. L. Cordain estimated that the optimal level of physical activity is on the order of 90 cal/kg/week (900 cal/day for a 70 kg human).
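The two estimates above can be checked with simple arithmetic; the following sketch only reproduces the figures as quoted:

```python
# Sanity-check the physical-activity estimates quoted above.
# Eaton: ~1,000 of ~3,000 kcal/day spent on activity, i.e. one-third of intake.
activity_fraction = 1000 / 3000
assert abs(activity_fraction - 1 / 3) < 1e-9

# Cordain: ~90 kcal per kg of body weight per week.
# For a 70 kg person this works out to the ~900 kcal/day figure cited.
def activity_kcal_per_day(kcal_per_kg_week, body_mass_kg):
    """Spread a weekly per-kilogram calorie budget over seven days."""
    return kcal_per_kg_week * body_mass_kg / 7

print(activity_kcal_per_day(90, 70))  # 900.0
```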
Critics have questioned the accuracy of the science on which the diet is based. John A. McDougall, M.D., author of The Starch Solution, has attempted to discredit the science used to reconstruct the Paleolithic diet, proposing instead that the human diet of that period was based primarily on starches.
The evolutionary assumptions underlying the Paleolithic diet have been disputed. According to Alexander Ströhle, Maike Wolters and Andreas Hahn, of the Department of Food Science at the University of Hanover, the statement that the human genome evolved during the Pleistocene (a period from 1,808,000 to 11,550 years ago) rests on the gene-centered view of evolution, which they believe to be controversial. They rely on Gray (2001) to argue that the evolution of organisms cannot be reduced to the genetic level with reference to mutation, and that there is no one-to-one relationship between genotype and phenotype. They further question the notion that 10,000 years is an insufficient period of time to ensure an adequate adaptation to agrarian diets. They note that alleles conferring lactose tolerance increased to high frequencies in Europe just a few thousand years after animal husbandry was invented, and that recent increases in the number of copies of the gene for salivary amylase, which digests starch, appear to be related to the development of agriculture. Referring to Wilson (1994), Ströhle et al. argue that "the number of generations that a species existed in the old environment was irrelevant, and that the response to the change of the environment of a species would depend on the heritability of the traits, the intensity of selection and the number of generations that selection acts." They state that if the diet of Neolithic agriculturalists had been in discordance with their physiology, it would have created a selection pressure for evolutionary change, and that modern humans such as Europeans, whose ancestors have subsisted on agrarian diets for 400–500 generations, should therefore be adequately adapted to these diets. In response to this argument, Wolfgang Kopp states that "we have to take into account that death from atherosclerosis and cardiovascular disease (CVD) occurs later during life, as a rule after the reproduction phase.
Even a high mortality from CVD after the reproduction phase will create little selection pressure. Thus, it seems that a diet can be functional (it keeps us going) and dysfunctional (it causes health problems) at the same time." Moreover, S. Boyd Eaton and colleagues have indicated that "comparative genetic data provide compelling evidence against the contention that long exposure to agricultural and industrial circumstances has distanced us, genetically, from our Stone Age (sic) ancestors", although they mention exceptions such as increased lactose and gluten tolerance, which improve the ability to digest dairy and grains; other studies indicate that human adaptive evolution has accelerated since the Paleolithic.
Referencing Mahner et al. (2001) and Ströhle et al. (2006), Ströhle et al. state that "whatever is the fact, to think that a dietary factor is valuable (functional) to the organism only when there was ‘genetical adaptation’ and hence a new dietary factor is dysfunctional per se because there was no evolutionary adaptation to it, such a panselectionist misreading of biological evolution seems to be inspired by a naive adaptationistic view of life."
The specific plant-to-animal food ratio in the Paleolithic diet is also a matter of some dispute. The average diet among modern foraging societies is estimated to consist of 64–68% animal calories and 32–36% plant calories, with animal calories further divided between fished and hunted animals in varying proportions (most typically, with hunted animal food comprising 26–35% of the overall diet). As part of the Man the Hunter paradigm, this ratio was used as the basis of the earliest forms of the Paleolithic diet by Voegtlin, Eaton and others. To this day, many advocates of the Paleolithic diet consider a high percentage of animal flesh to be one of the key features of the diet.
However, great disparities exist even between different modern foraging societies. The animal-derived calorie percentage ranges from 25% in the Gwi people of southern Africa to 99% in the Alaskan Nunamiut. The animal-derived percentage is skewed upwards by hunting-oriented polar foraging societies, who have no choice but to eat animal food because of the inaccessibility of plant foods. Since those environments were only populated relatively recently (for example, the Paleo-Indian ancestors of the Nunamiut are thought to have arrived in Alaska no earlier than 30,000 years ago), such diets represent recent adaptations rather than the conditions that shaped human evolution during much of the Paleolithic. More generally, hunting and fishing tend to provide a higher percentage of energy in forager societies living at higher latitudes. Excluding cold-climate and equestrian foragers results in a diet structure of 52% plant calories, 26% hunting calories, and 22% fishing calories. Furthermore, those numbers may still not be representative of a typical Paleolithic diet, since fishing did not become common in many parts of the world until the Upper Paleolithic period, 35,000–40,000 years ago, and early humans' hunting abilities were relatively limited compared to those of modern foragers (the oldest incontrovertible evidence for the existence of bows dates only to about 8000 BCE, and nets and traps were invented 20,000 to 29,000 years ago).
Another view is that, up until the Upper Paleolithic, humans were frugivores (fruit eaters) who supplemented their meals with carrion, eggs, and small prey such as baby birds and mussels, and only on rare occasions managed to kill and consume big game such as antelopes. This view is supported by studies of the higher apes, particularly chimpanzees. Chimpanzees are the closest to humans genetically, sharing more than 96% of their DNA, and their digestive tract is functionally very similar to that of humans. Chimpanzees are primarily frugivores, but they can and will consume and digest animal flesh given the opportunity. In general, their actual diet in the wild is about 95% plant-based, with the remaining 5% made up of insects, eggs, and baby animals; in some ecosystems, however, chimpanzees are predatory, forming parties to hunt monkeys. Some comparative studies of human and higher-primate digestive tracts suggest that humans have evolved to obtain a greater proportion of calories from sources such as animal foods, allowing them to shrink the size of the gastrointestinal tract relative to body mass and to increase brain mass instead.
A difficulty with the frugivore view is that humans are established to conditionally require certain long-chain polyunsaturated fatty acids (LC-PUFAs), such as arachidonic acid (AA) and docosahexaenoic acid (DHA), from the diet. Human LC-PUFA requirements are much greater than chimpanzees' because of humans' larger brain mass, and humans' ability to synthesize these fatty acids from other nutrients is poor, suggesting a reliance on readily available dietary sources. Pregnant and lactating females require 100 mg of DHA per day. Yet LC-PUFAs are almost nonexistent in plants and in most tissues of warm-climate animals.
The main sources of DHA in the modern human diet are fish and the fatty organs of animals, such as brains, eyes and viscera. Microalgae are a farmed, plant-based source commonly used by vegetarians. Despite the general shortage of evidence for extensive fishing, thought to require relatively sophisticated tools that have become available only in the last 30–50 thousand years, it has been argued that exploitation of coastal fauna somehow provided hominids with abundant LC-PUFAs. Alternatively, it has been proposed that early hominids frequently scavenged predators' kills and consumed parts left untouched by predators, most commonly the brain, which is very high in AA and DHA. Just 100 g of scavenged African ruminant brain matter provides more DHA than a typical modern U.S. adult consumes in the course of a week. Other authors have suggested that the human ability to convert alpha-linolenic acid into DHA, while poor, is nevertheless adequate to prevent DHA deficiency in a plant-based diet.
Since the end of the Paleolithic period, several foods that humans rarely or never consumed during previous stages of their evolution have been introduced as staples in their diet. With the advent of agriculture and the beginning of animal domestication roughly 10,000 years ago, during the Neolithic Revolution, humans started consuming large amounts of dairy products, beans, cereals, alcohol and salt. In the late 18th and early 19th centuries, the Industrial Revolution led to the large-scale development of mechanized food-processing techniques and intensive livestock-farming methods that enabled the production of refined cereals, refined sugars and refined vegetable oils, as well as fattier domestic meats, which have become major components of Western diets.
Such food staples have fundamentally altered several key nutritional characteristics of the human diet since the Paleolithic era, including glycemic load, fatty acid composition, macronutrient composition, micronutrient density, acid-base balance, sodium-potassium ratio, and fiber content.
These dietary compositional changes have been theorized as risk factors in the pathogenesis of many of the so-called "diseases of civilization" and other chronic illnesses that have dramatically increased in prevalence since the end of World War II, including obesity, cardiovascular disease, high blood pressure, type 2 diabetes, osteoporosis, autoimmune diseases, colorectal cancer, myopia, acne, depression, and diseases related to vitamin and mineral deficiencies.
Advocates note that "the increased contribution of carbohydrate from grains to the human diet following the agricultural revolution has effectively diluted the protein content of the human diet." In modern forager diets, dietary protein is characteristically elevated (19–35% of energy) at the expense of carbohydrate (22–40% of energy). High-protein diets may have a cardiovascular protective effect and may represent an effective weight-loss strategy for the overweight or obese. Furthermore, carbohydrate restriction may help prevent obesity and type 2 diabetes, as well as atherosclerosis. Carbohydrate deprivation to the point of ketosis has been argued to have both negative and positive effects on health.
The notion that preagricultural foragers would have typically consumed a diet relatively low in carbohydrate and high in protein has been questioned. Critics argue that there is insufficient data to identify the relative proportions of plant and animal foods consumed on average by Paleolithic humans in general, and they stress the rich variety of ancient and modern forager diets. Furthermore, substantial evidence exists suggesting many preagricultural foraging societies may have routinely consumed large quantities of carbohydrates in the form of carbohydrate-rich tubers (plant underground storage organs). According to Staffan Lindeberg, an advocate of the Paleolithic diet, a plant-based diet rich in carbohydrates is consistent with the human evolutionary past.
It has also been argued that relative freedom from degenerative diseases was, and still is, characteristic of all forager societies, irrespective of the macronutrient characteristics of their diets. Marion Nestle, a professor in the Department of Nutrition and Food Studies at New York University, has suggested, based on research relating nutritional factors to chronic disease risks and on observations of exceptionally low chronic disease rates among people eating Asian, Mediterranean, and vegetarian diets, that plant-based diets may be most associated with health and longevity.
Forager diets have been argued to maintain relatively high levels of monounsaturated and polyunsaturated fats, moderate levels of saturated fats (10–15% of total food energy), and a low omega-6:omega-3 fatty acid ratio. Cows fed a grass-based diet produce significantly more omega-3 fatty acids than grain-fed animals, with lower levels of trans and saturated fats. This high ratio of polyunsaturated to saturated fats has been challenged: although a low saturated fat intake has been argued for, critics counter that foragers would selectively hunt fatter animals and utilise the fattiest parts of the animals (such as bone marrow).
The Paleolithic diet has a lower energy density than the typical diet consumed by modern humans. This is especially true of primarily plant-based/vegetarian versions of the diet, but it still holds if substantial amounts of meat are included in the calculations. For example, most fruits and berries contain 0.4 to 0.8 calories per gram, and vegetables can be even lower than that (cucumbers contain only 0.16 calories per gram). Game meat, such as cooked wild rabbit, is more energy-dense (up to 1.7 calories per gram), but it does not constitute the bulk of the diet by mass/volume at the recommended plant/animal ratios, and it does not reach the densities of many processed foods commonly consumed by modern humans: most McDonald's sandwiches, such as the Big Mac, average 2.4 to 2.8 calories/gram, and sweets such as cookies and chocolate bars commonly exceed 4 calories/gram.
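The overall energy density of a mixed meal is simply total calories divided by total mass. The sketch below combines the per-gram figures quoted above; the gram amounts of each food are illustrative assumptions, not values from the article:

```python
# Compare the energy density (kcal per gram) of a plant-heavy,
# forager-style plate against a processed-food snack, using the
# per-gram figures quoted in the text. Gram amounts are assumed.
def energy_density(items):
    """items: list of (grams, kcal_per_gram) pairs; returns overall kcal/gram."""
    total_kcal = sum(grams * density for grams, density in items)
    total_grams = sum(grams for grams, _ in items)
    return total_kcal / total_grams

# 300 g berries (0.6 kcal/g), 300 g vegetables (0.3), 200 g cooked rabbit (1.7)
paleo_plate = [(300, 0.6), (300, 0.3), (200, 1.7)]
# 220 g sandwich at 2.6 kcal/g plus 50 g cookie at 4.2 kcal/g
processed_snack = [(220, 2.6), (50, 4.2)]

print(round(energy_density(paleo_plate), 2))      # 0.76
print(round(energy_density(processed_snack), 2))  # 2.9
```

Even with a generous portion of energy-dense game meat, the mixed plate stays well under 1 kcal/gram, while the processed snack is several times denser.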
Diets with low caloric density tend to provide a greater feeling of satiety at the same energy intake, and they have been shown to be effective at achieving weight loss in overweight individuals without explicit caloric restriction.
Even some authors otherwise critical of the Paleolithic diet concept have argued that the high energy density of modern diets, as compared to ancestral and primate diets, contributes to the incidence of diseases of affluence in the industrial world.
Fruits, vegetables, meat and organ meats, and seafood, which are staples of the forager diet, are more micronutrient-dense than refined sugars, grains, vegetable oils, and dairy products in relation to digestible energy. Consequently, the vitamin and mineral content of the diet is very high compared with a standard diet, in many cases a multiple of the RDA. Fish and seafood are a particularly rich source of omega-3 fatty acids and other micronutrients, such as iodine, iron, zinc, copper, and selenium, which are crucial for proper brain function and development. Terrestrial animal foods, such as muscle, brain, bone marrow, thyroid gland, and other organs, are also a primary source of these nutrients. Two notable exceptions are calcium (see below) and vitamin D, both of which may be present in the diet in inadequate quantities. Calcium-poor grains and legumes are excluded from the diet, although leafy greens such as kale and dandelion greens, as well as nuts such as almonds, are good sources of calcium, and components in plants make their modest calcium content more easily absorbed than that of high-calcium foods such as dairy. Modern humans require much more vitamin D than foragers because they get less sun exposure; this need is commonly met in developed countries by artificially fortifying dairy products with the vitamin. To avoid deficiency, a modern human on a forager diet would have to take vitamin D supplements, ensure adequate intake of fatty fish, or increase sun exposure (an estimated 30 minutes of exposure to mid-day sun twice a week is adequate for most people).
Despite its relatively low carbohydrate content, the Paleolithic diet involves a substantial increase in consumption of fruit and vegetables, compared to the Western diet, potentially as high as 1.65 to 1.9 kg/day. Forager diets, which rely on uncultivated, heavily fibrous fruit and vegetables, contain even more. Fiber intake in preagricultural diets is thought to have exceeded 100 g/day. This is dramatically higher than the actual current U.S. intake of 15 g/day.
Unrefined wild plant foods like those available to contemporary foragers typically exhibit low glycemic indices. By contrast, dairy products, such as milk, have low glycemic indices but are highly insulinotropic, with an insulin index similar to that of white bread; in fermented milk products, such as yogurt, the presence of organic acids may counteract the insulinotropic effect of milk in mixed meals. These dietary characteristics may lower the risk of diabetes, obesity, and other diseases of the metabolic syndrome by placing less stress on the pancreas to produce insulin, owing to the staggered absorption of glucose, thus helping to prevent insulin resistance.
It has been estimated that people in the Paleolithic era consumed 11,000 mg of potassium and 700 mg of sodium daily.
Diets containing high amounts of animal products, animal protein, processed foods, and other foods that induce and sustain increased acidity of body fluid may contribute to the development of osteoporosis and renal stones, loss of muscle mass, and age-related renal insufficiency due to the body's use of calcium to buffer pH. The paleo diet may not contain the high levels of calcium recommended in the U.S. to prevent these effects. However, because of the absence of cereals and energy-dense, nutrient-poor foods in the ancestral forager diet—foods that displace base-yielding fruits and vegetables—that diet has been estimated to produce a net base load on the body, as opposed to a net acid load, which may reduce calcium excretion.
Furthermore, cereal grains, legumes and milk contain bioactive substances, such as gluten and casein, which have been implicated in the development of various health problems. Consumption of gluten, a component of certain grains, such as wheat, rye and barley, is known to have adverse health effects in individuals suffering from a range of gluten sensitivities, including celiac disease. Since the Paleolithic diet is devoid of cereal grains, it is free of gluten. The paleo diet is also casein-free. Casein, a protein found in milk and dairy products, may impair glucose tolerance in humans.
Compared to Paleolithic food groups, cereal grains and legumes contain high amounts of antinutrients, including alkylresorcinols, alpha-amylase inhibitors, protease inhibitors, lectins and phytates, substances known to interfere with the body's absorption of many key nutrients. Molecular-mimicking proteins, whose amino acid sequences closely resemble those of other, entirely different proteins, are also found in grains and legumes, as well as in milk and dairy products. Advocates of the Paleolithic diet have argued that these components of agrarian diets promote vitamin and mineral deficiencies and may explain the development of the "diseases of civilization" as well as a number of autoimmune-related diseases.
Some variants of the diet exclude fruit; others include dairy products or legumes.
One line of evidence used to support the Paleolithic diet is the decline in human health and body mass purported to have occurred with the adoption of agriculture at the end of the Paleolithic era. With the introduction of domesticated and processed plant foods, such as cereal grains, into the human diet, there was in many areas a general decrease in body stature and dentition size and an increase in rates of dental caries. A general decline in health followed the introduction of agriculture in some areas as agricultural and pastoral peoples increased their populations and drove hunter-gatherers to extinction or into marginal lands, so it is unclear to what degree this change is related solely, or at all, to dietary patterns. Other factors, such as increased sedentism, different work habits, greater general caloric scarcity, and denser settlement patterns with concomitant disease transmission, may also have played important roles. Evidence for the effect of the switch to agriculture on general life expectancy is mixed, with some populations exhibiting an apparent decrease in life expectancy and others an apparent increase.
Based on the subsistence patterns and biomarkers of foragers studied in the last century, advocates argue that modern humans are well adapted to the diet of their Paleolithic ancestors. The diet of modern forager groups is believed to be representative of the patterns of humans of fifty to twenty-five thousand years ago, and individuals from these and other technologically primitive societies, including those who reach the age of 60 or beyond, seem to be largely free of the signs and symptoms of chronic disease (such as obesity, high blood pressure, nonobstructive coronary atherosclerosis, and insulin resistance) that universally afflict the elderly in western societies (with the exception of osteoarthritis, which afflicts both populations). Moreover, when these people adopt western diets, their health declines and they begin to exhibit signs and symptoms of "diseases of civilization". In one clinical study, stroke and ischaemic heart disease appeared to be absent in a population living on the island of Kitava, in Papua New Guinea, where a subsistence lifestyle, uninfluenced by western dietary habits, was still maintained.
One of the most frequent criticisms of the Paleolithic diet is that preagricultural foragers were unlikely to have suffered from the diseases of modern civilization simply because they did not live long enough to develop these illnesses, which are typically associated with old age. According to S. Jay Olshansky and Bruce Carnes, "there is neither convincing evidence nor scientific logic to support the claim that adherence to a Paleolithic diet provides a longevity benefit." In response, advocates of the paleo diet state that while Paleolithic foragers did have a short average life expectancy, modern human populations with lifestyles resembling those of our preagricultural ancestors have few or no diseases of affluence, despite sufficient numbers of elderly. In forager societies where demographic data are available, the elderly are present, but they tend to have high mortality rates and rarely survive past the age of 80, with causes of death (when known) ranging from injuries to measles and tuberculosis.
Critics further contend that food energy excess, rather than the consumption of specific novel foods, such as grains and dairy products, underlies the diseases of affluence. According to Geoffrey Cannon, science and health policy advisor to the World Cancer Research Fund, humans are designed to work hard physically to produce food for subsistence and to survive periods of acute food shortage, and are not adapted to a diet rich in energy-dense foods. Similarly, William R. Leonard, a professor of anthropology at Northwestern University, states that the health problems facing industrial societies stem not from deviations from a specific ancestral diet but from an imbalance between calories consumed and calories burned, a state of energy excess uncharacteristic of ancestral lifestyles.
The first animal experiment on a Paleolithic diet suggested that this diet, as compared with a cereal-based diet, conferred higher insulin sensitivity, lower C-reactive protein and lower blood pressure in 24 domestic pigs. There was no difference in basal serum glucose. The first human clinical randomized controlled trial involved 29 people with glucose intolerance and ischemic heart disease, and it found that those on a Paleolithic diet had a greater improvement in glucose tolerance compared to those on a Mediterranean diet. Furthermore, the Paleolithic diet was found to be more satiating per calorie compared to the Mediterranean diet.
A clinical, randomized, controlled cross-over study in the primary care setting compared the Paleolithic diet with a commonly prescribed diet for type 2 diabetes. The Paleolithic diet resulted in lower mean values of HbA1c, triacylglycerol, diastolic blood pressure, body mass index, and waist circumference, and higher values of high-density lipoprotein, when compared to the diabetes diet. Glycemic control and other cardiovascular risk factors improved on both diets, without significant differences between them. The Paleolithic diet was also lower in total energy, energy density, carbohydrate, dietary glycemic load and glycemic index, saturated fatty acids, and calcium, but higher in unsaturated fatty acids, dietary cholesterol, and some vitamins. Two clinical trials designed to test various physiological effects of the Paleolithic diet are currently underway, and the results of one completed trial have shown metabolic and physiologic improvements. The European Journal of Clinical Nutrition published a study of a trial of the Paleolithic diet in 20 healthy volunteers. The study had no control group, and only 14 individuals completed the diet. In the study, over three weeks there was an average weight reduction of 2.3 kg, an average reduction in waist circumference of 1.5 cm (about one-half inch), an average reduction in systolic blood pressure of 3 mm Hg, and a 72% reduction in plasminogen activator inhibitor-1 (which might translate into a reduced risk of heart attack and stroke). However, the NHS Knowledge Service pointed out that this study, like most human diet studies, relied on observational data. The NHS concluded that the lack of a control group and the small sample size of the study compromised its conclusions.
With only 14 participants, the study lacked the statistical power to detect health improvements, and the simple fact that these 14 individuals knew they were on a diet program may have made them more conscious of their weight and exercise regimen, skewing the results.
Critics have argued that to the extent that foraging societies fail to suffer from "diseases of civilization", this may be due to reduced calories in their diet, shorter average lifespans, or a variety of other factors, rather than dietary composition. Some researchers have also taken issue with the accuracy of the diet's underlying evolutionary logic or suggested that the diet could potentially pose health risks.
A 2011 ranking by U.S. News & World Report, involving a panel of 22 experts, ranked the Paleo diet lowest of the 20 diets evaluated based on factors including health, weight loss, and ease of following. These results were repeated in the 2012 survey, in which the diet tied with the Dukan diet for the lowest ranking out of 29 diets; U.S. News stated that their experts "took issue with the diet on every measure". In the 2013 rankings, the Paleo diet tied for last place out of 29, and in 2014, it tied for last out of 32. However, one expert involved in the ranking stated that a "true Paleo diet might be a great option: very lean, pure meats, lots of wild plants. The modern approximations… are far from it." He added that "duplicating such a regimen in modern times would be difficult."
The U.S. News ranking assumed a low-carb version of the paleo diet, specifically containing only 23% carbohydrates. Higher carbohydrate versions of the paleo diet, which allow for significant consumption of root vegetables, were not a part of this ranking. Dr. Loren Cordain, a proponent of a low-carbohydrate Paleolithic diet, responded to the U.S. News ranking, stating that their "conclusions are erroneous and misleading" and pointing out that "five studies, four since 2007, have experimentally tested contemporary versions of ancestral human diets and have found them to be superior to Mediterranean diets, diabetic diets and typical western diets in regard to weight loss, cardiovascular disease risk factors and risk factors for type 2 diabetes." The editors of the U.S. News ranking replied that they had reviewed the five studies and found them to be "small and short, making strong conclusions difficult".