By Dr. Andrew Weil HuffPost Healthy Living
The advice to have five or six small meals daily has become common in recent years. I am 69 years old; I don't recall hearing this advice as a child, and I seldom heard it as a young adult, but by the 1980s it seemed to be everywhere. Today, it is close to nutritional dogma. It is not surprising that an online search of the phrase "eat many small meals" returns over 275,000 results.
The usual justification for eating extra meals is that it keeps the metabolism "revved up" so that weight loss is easier. There is, however, very little hard evidence that supports this idea, and a fair amount that disputes it. For example, a research analysis published in the British Journal of Nutrition concluded that "any effects of meal pattern on the regulation of body weight" appear to be negligible, and what matters is total food intake.
Worse, the "eat many small meals" advice has two clearly negative effects:
First, in practice, those extra meals usually aren't vegetable-intensive, home-cooked ones. These days, they are likely to be "energy bars" (a euphemism for candy bars), snack mixes, and so on. In other words, high-glycemic-load processed snacks.
Second, when people are told to "eat many small meals," what they may actually hear is "eat all the time," making them likely to respond with some degree of compulsive overeating. It's no coincidence, I think, that obesity rates began rising rapidly in the 1980s, more or less in tandem with this widespread endorsement of more frequent meals. (The other major culprit was the government's scientifically shaky "low-fat" dietary recommendation, which led to rampant overconsumption of carbohydrates.) In my travels around the world, I am often struck by how rarely I see people eating in their cars, or while strolling down the street, or otherwise outside the traditional time and space boundaries of a meal. In the U.S., these behaviors are ubiquitous.
So the time has come to explore the opposite idea: regularly allowing greater-than-normal amounts of time to pass between meals, a practice known as "intermittent fasting," or IF. Frankly, today in America, simply eating three meals with no snacks might be called a form of IF, if only by way of contrast. If we were to return to this once-common practice, I believe we would be healthier for it.
The basic premise of IF is to enjoy better health via repeatedly fasting for longer periods than is typical on a daily breakfast-lunch-dinner schedule. Variations are endless. Some proponents skip breakfast; others, dinner. Others fast all day every other day, every third day, once per week, or once per month. A friend who travels for work six to eight times annually always fasts on the first and last days of his trips, reasoning that airline food is awful anyway. (Fasting, it should be pointed out, means abstaining from solid food; all sensible IF plans allow hydration with water, tea, or other no- or low-calorie beverages.)
An IF regime works, proponents say, because it aligns with our evolutionary history. Over the 250,000 years that Homo sapiens have been around, the food supply has waxed and waned. We evolved to take advantage of this fact, building muscle and fatty tissue during times of abundance, then paring them back during lean ones. Fasting periods accelerate the clearing-out of waste left by dead and damaged cells, a process known as autophagy. A failure of autophagy to keep up with accumulated cellular debris is believed by many scientists to be one of the major causes of the chronic diseases associated with aging.
Occasional fasting also seems to boost activity and growth of certain types of cells, especially neurons. This may seem odd, but consider it from an evolutionary perspective -- when food is scarce, natural selection would favor those whose memories ("Where have we found food before?") and cognition ("How can we get it again?") became sharper.
Research indicates that the benefits of IF may be similar to those of caloric restriction (CR), in which meals are regular but portions are smaller than normal. The advantage of IF, proponents say, is that it is easier to tolerate sharp hunger occasionally than to endure the mild hunger of CR virtually all the time.
The positive effects of IF have been chronicled in a variety of animal and human studies, starting with a seminal experiment in 1946, when University of Chicago researchers discovered that denying food every third day boosted rats' lifespans by 20 percent in males, 15 percent in females. A 2007 review by University of California, Berkeley, researchers concluded that alternate-day fasting may:
Decrease cardiovascular disease risk.
Decrease cancer risk.
Lower diabetes risk (at least in animals; data on humans were less clear, possibly because the trial periods in the studies were not long enough to show an effect).
Improve cognitive function.
Protect against some effects of Alzheimer's and Parkinson's diseases.
What should we make of this?
I don't recommend IF for everyone. Children under 18 should not fast, nor should diabetics, nor pregnant or lactating women. Some health conditions -- such as severe gastroesophageal reflux disease, or GERD -- are easier to manage when food intake is more regular.
But I do think the evidence for the health benefits of IF should make us rethink what seems to be a modern cultural imperative: to avoid hunger at all costs. To the contrary, getting hungry now and then is clearly a healthy thing to do as long as overall caloric intake stays high enough to maintain a healthy weight. (Fasting, like every other healthy activity, must be done sensibly and in moderation.) Many people who follow IF regimes report both physical and mental benefits, including improved energy and concentration, better sleep, and an overall feeling of well-being.
If you practice IF, please share your experiences in the comments below -- what's your eating-fasting pattern, and what health effects have you noticed?