“For me, a calorie is a unit of measurement that’s a real pain in the rear.”
Bo Nash is 38. He lives in Arlington, Texas, where he’s a technology director for a textbook publisher. He has a wife and child. And he’s 5’10” and 245 pounds—which means he is classed as obese.
In an effort to lose weight, Nash uses an app to record the calories he consumes and a Fitbit band to track the energy he expends. These tools bring an apparent precision: Nash can quantify the calories in each cracker crunched and stair climbed. But when it comes to weight gain, he finds that not all calories are equal. How much weight he gains or loses seems to depend less on the total number of calories, and more on where the calories come from and how he consumes them. The unit, he says, has a “nebulous quality to it.”
Tara Haelle is also obese. She had her second son on St Patrick’s Day in 2014, and hasn’t been able to lose the 70 pounds she gained during pregnancy. Haelle is a freelance science journalist, based in Illinois. She understands the science of weight loss, but, like Nash, doesn’t see it translate into practice. “It makes sense from a mathematical and scientific and even visceral level that what you put in and what you take out, measured in the discrete unit of the calorie, should balance,” says Haelle. “But it doesn’t seem to work that way.”
Nash and Haelle are in good company: More than two-thirds of American adults are overweight or obese. For many of them, the cure is diet: One in three are attempting to lose weight in this way at any given moment. Yet there is ample evidence that diets rarely lead to sustained weight loss. These are expensive failures. This inability to curb the extraordinary prevalence of obesity costs the United States more than $147 billion in healthcare, as well as $4.3 billion in job absenteeism and yet more in lost productivity.
At the heart of this issue is a single unit of measurement—the calorie—and some seemingly straightforward arithmetic. “To lose weight, you must use up more calories than you take in,” according to the Centers for Disease Control and Prevention. Dieters like Nash and Haelle could eat all their meals at McDonald’s and still lose weight, provided they burn enough calories, says Marion Nestle, a professor of nutrition, food studies, and public health at New York University. “Really, that’s all it takes.”
But Nash and Haelle do not find weight control so simple. And part of the problem goes way beyond individual self-control. The numbers logged in Nash’s Fitbit, or printed on the food labels that Haelle reads religiously, are at best good guesses. Worse yet, as scientists are increasingly finding, some of those calorie counts are flat-out wrong—off by more than enough, for instance, to wipe out the calories Haelle burns by running an extra mile on a treadmill. A calorie isn’t just a calorie. And our mistaken faith in the power of this seemingly simple measurement may be hindering the fight against obesity.
The process of counting calories begins in an anonymous office block in Maryland. The building is home to the Beltsville Human Nutrition Research Center, a facility run by the US Department of Agriculture. When we visit, the kitchen staff are preparing dinner for people enrolled in a study. Plastic dinner trays are laid out with meatloaf, mashed potatoes, corn, brown bread, a chocolate-chip scone, vanilla yogurt, and a can of tomato juice. The staff weigh and bag each item, sometimes adding an extra two-centimeter sliver of bread to ensure a tray’s contents add up to the exact calorie requirements of each participant. “We actually get compliments about the food,” says David Baer, a supervisory research physiologist with the department.
The work that Baer and colleagues do draws on centuries-old techniques. Nestle traces modern attempts to understand food and energy back to a French aristocrat and chemist named Antoine Lavoisier. In the early 1780s, Lavoisier developed a triple-walled metal canister large enough to house a guinea pig. Inside the walls was a layer of ice. Lavoisier knew how much energy was required to melt ice, so he could estimate the heat the animal emitted by measuring the amount of water that dripped from the canister. What Lavoisier didn’t realize—and never had time to find out; he was put to the guillotine during the Revolution—was that measuring the heat emitted by his guinea pigs was a way to estimate the amount of energy they had extracted from the food they were digesting.
Until recently, the scientists at Beltsville used what was essentially a scaled-up version of Lavoisier’s canister to estimate the energy used by humans: a small room in which a person could sleep, eat, excrete, and walk on a treadmill, while temperature sensors embedded in the walls measured the heat given off and thus the calories burned. (We now measure this energy in calories. Roughly speaking, one food calorie, technically a kilocalorie, is the heat required to raise the temperature of one kilogram of water by one degree Celsius.) Today, those ‘direct-heat’ calorimeters have largely been replaced by ‘indirect-heat’ systems, in which sensors measure oxygen intake and carbon dioxide exhalations. Scientists know how much energy is used during the metabolic processes that create the carbon dioxide we breathe out, so they can work backwards to deduce that, for example, a human who has exhaled 15 liters of carbon dioxide must have used 94 calories of energy.
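For readers who want to see that conversion written out, here is a minimal sketch. It assumes the simple proportional relationship implied by the round numbers above; real indirect calorimetry also accounts for oxygen consumption and the mix of fuels being burned, so this is an illustration rather than a lab-grade formula.

```python
# Minimal sketch of the indirect-calorimetry conversion described above.
# Assumption: energy scales linearly with exhaled CO2 at the article's
# round-number ratio (94 kcal per 15 liters). Real systems also track
# oxygen intake; this is illustrative only.

KCAL_PER_LITER_CO2 = 94 / 15   # roughly 6.3 kcal per liter of exhaled CO2

def kcal_from_co2(liters_exhaled: float) -> float:
    """Estimate the energy (kcal) a subject has expended from exhaled CO2 volume."""
    return liters_exhaled * KCAL_PER_LITER_CO2

print(kcal_from_co2(15))   # ~94 kcal, matching the example in the text
```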
The facility’s three indirect calorimeters are down the halls from the research kitchen. “They’re basically nothing more than walk-in coolers, modified to allow people to live in here,” physiologist William Rumpler explains as he shows us around. Inside each white room, a single bed is folded up against the wall, alongside a toilet, sink, a small desk and chair, and a short treadmill. A couple of airlocks allow food, urine, feces, and blood samples to be passed back and forth. Apart from these reminders of the room’s purpose, the vinyl-floored, fluorescent-lit units resemble a 1970s dorm room. Rumpler explains that subjects typically spend 24 to 48 hours inside the calorimeter, following a highly structured schedule. A notice pinned to the door outlines the protocol for the latest study:
6:00 to 6:45pm – Dinner,
11:00pm – Latest bedtime, mandatory lights out,
11:00pm to 6:30am – Sleep, remain in bed even if not sleeping.
In between meals, blood tests, and bowel movements, calorimeter residents are asked to walk on the treadmill at 3 miles per hour for 30 minutes. They fill the rest of the day with what Rumpler calls “low activity”. “We encourage people to bring knitting or books to read,” he says. “If you give people a free hand, you’ll be surprised by what they’ll do inside the chamber.” He tells us that one of his less cooperative subjects smuggled in a bag of M&Ms, and then gave himself away by dropping them on the floor.
Using a bank of screens just outside the rooms, Rumpler can monitor exactly how many calories each subject is burning at any moment. Over the years, he and his colleagues have aggregated these individual results to arrive at numbers for general use: how many calories a 120 pound woman burns while running at 4.0 miles an hour, say, or the calories a sedentary man in his 60s needs to consume every day. It’s the averages derived from thousands of extremely precise measurements that provide the numbers in Bo Nash’s movement tracker and help Tara Haelle set a daily calorie intake target that is based on her height and weight.
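One common way such averages get translated into an individual estimate is the MET convention, in which each activity is assigned a multiple of resting metabolism. The sketch below uses that convention and an illustrative MET value for running at 4 miles per hour; it is a rough approximation, not necessarily the formula any particular tracker uses.

```python
# Rough sketch of turning published activity averages into a personal estimate.
# Assumptions: the standard MET convention (kcal per minute = METs * 3.5 * kg / 200)
# and an illustrative value of about 6 METs for running at 4 miles per hour.

def kcal_per_minute(mets: float, weight_kg: float) -> float:
    """Estimate calories burned per minute from a MET value and body weight."""
    return mets * 3.5 * weight_kg / 200

weight_kg = 120 * 0.4536                       # the 120-pound woman from the text, in kilograms
print(kcal_per_minute(6.0, weight_kg) * 30)    # ~170 kcal for a 30-minute run
```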
Measuring the calories in food itself relies on another modification of Lavoisier’s device. In 1848, an Irish chemist called Thomas Andrews realized that he could estimate calorie content by setting food on fire in a chamber and measuring the temperature change in the surrounding water. (Burning food is chemically similar to the ways in which our bodies break food down, despite being much faster and less controlled.) Versions of Andrews’s ‘bomb calorimeter’ are used to measure the calories in food today. At the Beltsville center, samples of the meatloaf, mashed potatoes, and tomato juice have been incinerated in the lab’s bomb calorimeter. “We freeze-dry it, crush into a powder, and fire it,” says Baer.
Humans are not bomb calorimeters, of course, and we don’t extract every calorie from the food we eat. This problem was addressed at the end of the 19th century, in one of the more epic experiments in the history of nutrition science. Wilbur Atwater, a Department of Agriculture scientist, began by measuring the calories contained in more than 4,000 foods. Then he fed those foods to volunteers and collected their feces, which he incinerated in a bomb calorimeter. After subtracting the energy measured in the feces from that in the food, he arrived at the Atwater values, numbers that represent the available energy in each gram of protein, carbohydrate, and fat. These century-old figures remain the basis for today’s standards. When Baer wants to know the calories per gram figure for that night’s meatloaf, he corrects the bomb calorimeter results using Atwater values.
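As a rough worked example of how those values are applied: the general Atwater factors round to 4 calories per gram of protein, 4 per gram of carbohydrate, and 9 per gram of fat. The sketch below uses those rounded factors with a hypothetical serving; the food-specific tables discussed later adjust the factors for digestibility.

```python
# Sketch of an Atwater-style label calculation using the rounded general
# factors (4/4/9 kcal per gram). The example serving below is hypothetical;
# food-specific factors would shift the result slightly.

ATWATER_KCAL_PER_GRAM = {"protein": 4.0, "carbohydrate": 4.0, "fat": 9.0}

def label_calories(protein_g: float, carbohydrate_g: float, fat_g: float) -> float:
    """Calories a label would list for a serving with these macronutrients."""
    return (protein_g * ATWATER_KCAL_PER_GRAM["protein"]
            + carbohydrate_g * ATWATER_KCAL_PER_GRAM["carbohydrate"]
            + fat_g * ATWATER_KCAL_PER_GRAM["fat"])

print(label_calories(protein_g=20, carbohydrate_g=15, fat_g=18))   # 302.0 kcal
```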
This entire enterprise, from the Beltsville facility to the numbers on the packets of the food we buy, creates an aura of scientific precision around the business of counting calories. That precision is illusory.
The trouble begins at source, with the lists compiled by Atwater and others. Companies are allowed to incinerate freeze-dried pellets of product in a bomb calorimeter to arrive at calorie counts, though most avoid that hassle, says Marion Nestle. Some use the data developed by Atwater in the late 1800s. But the Food and Drug Administration (FDA) also allows companies to use a modified set of values, published by the Department of Agriculture in 1955, that take into account our ability to digest different foods in different ways.
Atwater’s numbers say that Tara Haelle can extract 8.9 calories per gram of fat in a plate of her favorite Tex-Mex refried beans; the modified table shows that, thanks to the indigestibility of some of the plant fibers in legumes, she only gets 8.3 calories per gram. Depending on the calorie-measuring method that a company chooses—the FDA allows two more variations on the theme, for a total of five—a given serving of spaghetti can contain from 200 to 210 calories. These uncertainties can add up. Haelle and Bo Nash might deny themselves a snack or sweat out another few floors on the StairMaster to make sure they don’t go 100 calories over their daily limit. If the data in their calorie counts is wrong, they can go over regardless.
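To put that arithmetic in perspective, here is a small sketch of how such gaps stack up against a 100-calorie daily margin. The per-gram factors and the spaghetti spread come from the figures above; the serving size and the number of items logged per day are made-up illustrations.

```python
# How labeling ambiguity can eat a 100-calorie daily margin.
# The 8.9 vs 8.3 kcal-per-gram fat factors and the ~10 kcal spaghetti spread
# come from the text; the serving size and daily item count are hypothetical.

fat_grams_in_serving = 12.0                   # hypothetical serving of refried beans
per_gram_gap = 8.9 - 8.3                      # Atwater factor vs. modified factor
print(fat_grams_in_serving * per_gram_gap)    # ~7 kcal of ambiguity from one ingredient

per_item_spread = 10        # kcal: roughly the spaghetti spread quoted above
items_logged_per_day = 12   # hypothetical number of foods a dieter logs in a day
print(per_item_spread * items_logged_per_day)  # 120 kcal: more than the 100-kcal margin
```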
There’s also the issue of serving size. After visiting over 40 U.S. chain restaurants, including Olive Garden, Outback Steakhouse, and P.F. Chang’s China Bistro, Susan Roberts of Tufts University’s nutrition research center and colleagues discovered that a dish listed as having, say, 500 calories could contain 800 instead. The difference could easily have been caused, says Roberts, by local chefs heaping on extra french fries or pouring a dollop more sauce. It would be almost impossible for a calorie-counting dieter to accurately estimate their intake given this kind of variation.
Even if the calorie counts themselves were accurate, dieters like Haelle and Nash would have to contend with the significant variations between the total calories in the food and the amount our bodies extract. These variations, which scientists have only recently started to understand, go beyond the inaccuracies in the numbers on the back of food packaging. In fact, the new research calls into question the validity of nutrition science’s core belief that a calorie is a calorie.
Using the Beltsville facilities, for instance, Baer and his colleagues found that our bodies sometimes extract fewer calories than the number listed on the label. Participants in their studies absorbed around a third fewer calories from almonds than the modified Atwater values suggest. For walnuts, the difference was 21 percent. This is good news for someone who is counting calories and likes to snack on almonds or walnuts: He or she is absorbing far fewer calories than expected. The difference, Baer suspects, is due to the nuts’ particular structure: “All the nutrients—the fat and the protein and things like that—they’re inside this plant cell wall.” Unless those walls are broken down—by processing, chewing, or cooking—some of the calories remain off-limits to the body, and thus are excreted rather than absorbed.
Another striking insight came from an attempt to eat like a chimp. In the early 1970s, Richard Wrangham, an anthropologist at Harvard University and author of the book Catching Fire: How Cooking Made Us Human, observed wild chimps in Africa. Wrangham attempted to follow the entirely raw diet he saw the animals eating, snacking only on fruit, seeds, leaves, and insects such as termites and army ants. “I discovered that it left me incredibly hungry,” he says. “And then I realized that every human eats their food cooked.”
Wrangham and his colleagues have since shown that cooking unlaces microscopic structures that bind energy in foods, reducing the work our gut would otherwise have to do. It effectively outsources digestion to ovens and frying pans. Wrangham found that mice fed raw peanuts, for instance, lost significantly more weight than mice fed the equivalent amount of roasted peanut butter. The same effect holds true for meat: There are many more usable calories in a burger than in steak tartare. Different cooking methods matter, too. In 2015, Sri Lankan scientists discovered that they could more than halve the available calories in rice by adding coconut oil during cooking and then cooling the rice in the refrigerator.
Wrangham’s findings have significant consequences for dieters. If Nash likes his porterhouse steak bloody, for example, he will likely be consuming several hundred fewer calories than if he has it well-done. Yet the FDA’s methods for creating a nutrition label do not for the most part account for the differences between raw and cooked food, or pureed versus whole, let alone the structure of plant versus animal cells. A steak is a steak, as far as the FDA is concerned.
Industrial food processing, which subjects foods to extremely high temperatures and pressures, might be freeing up even more calories. The food industry, says Wrangham, has been “increasingly turning our food to mush, to the maximum calories you can get out of it. Which, of course, is all very ironic, because in the West there’s tremendous pressure to reduce the number of calories you’re getting out of your food.” He expects to find examples of structural differences that affect caloric availability in many more foods. “I think there is work here for hundreds and probably thousands of nutritionists for years,” he says.
There’s also the problem that no two people are identical. Differences in height, body fat, liver size, levels of the stress hormone cortisol, and other factors influence the energy required to maintain the body’s basic functions. Between two people of the same sex, weight, and age, this number may differ by up to 600 calories a day—over a quarter of the recommended intake for a moderately active woman. Even something as seemingly insignificant as the time at which we eat may affect how we process energy. In one recent study, researchers found that mice fed a high-fat diet between 9am and 5pm gained 28 percent less weight than mice fed the exact same food across a 24-hour period. The researchers suggested that irregular feedings affect the circadian cycle of the liver and the way it metabolizes food, thus influencing overall energy balance. Such differences would not emerge under the feeding schedules in the Beltsville experiments.
Until recently, the idea that genetics plays a significant role in obesity had some traction: Researchers hypothesized that evolutionary pressures may have favored genes that predisposed some people to hold on to more calories in the form of added fat. Today, however, most scientists believe we can’t blame DNA for making us overweight. “The prevalence of obesity started to rise quite sharply in the 1980s,” says Nestle. “Genetics did not change in that ten- or twenty-year period. So genetics can only account for part of it.”
Instead, researchers are beginning to attribute much of the variation to the trillions of tiny creatures that line the coiled tubes inside our midriffs. The microbes in our intestines digest some of the tough or fibrous matter that our stomachs cannot break down, releasing a flow of additional calories in the process. But different species and strains of microbes vary in how effective they are at releasing those extra calories, as well as how generously they share them with their host human.
In 2013, researchers in Jeffrey Gordon’s lab at Washington University tracked down pairs of twins of whom one was obese and one lean. The team took gut microbes from each twin and inserted them into the intestines of microbe-free mice. Mice that got microbes from an obese twin gained weight; the others remained lean, despite eating the exact same diet. “That was really striking,” said Peter Turnbaugh, who used to work with Gordon and now heads his own lab at the University of California, San Francisco. “It suggested for the first time that these microbes might actually be contributing to the energy that we gain from our diet.”
The diversity of microbes that each of us hosts is as individual as a fingerprint and yet easily transformed by diet and our environment. And though it is poorly understood, new findings about how our gut microbes affect our overall energy balance are emerging almost daily. For example, it seems that medications that are known to cause weight gain might be doing so by modifying the populations of microbes in our gut. In November 2015, researchers showed that risperidone, an antipsychotic drug, altered the gut microbes of mice that received it. The microbial changes slowed the animals’ resting metabolisms, causing them to increase their body mass by 10 percent in two months. The authors liken the effects to a 30-pound weight gain over one year for an average human, which they say would be the equivalent of an extra cheeseburger every day.
Other evidence suggests that gut microbes might affect weight gain in humans as they do in lab animals. Take the case of the woman who gained more than 40 pounds after receiving a transplant of gut microbes from her overweight teenage daughter. The transplant successfully treated the mother’s intestinal infection of Clostridium difficile, which had resisted antibiotics. But, as of the study’s publication last year, she hadn’t been able to shed the excess weight through diet or exercise. The only aspect of her physiology that had changed was her gut microbes.
All of these factors introduce a disturbingly large margin of error for an individual who is trying, like Nash, Haelle, and millions of others, to count calories. The discrepancies between the number on the label and the calories that are actually available in our food, combined with individual variations in how we metabolize that food, can add up to much more than the 200 calories a day that nutritionists often advise cutting in order to lose weight. Nash and Haelle can do everything right and still not lose weight.
None of this means that the calorie is a useless concept. Inaccurate as they are, calorie counts remain a helpful guide to relative energy values: standing burns more calories than sitting; cookies contain more calories than spinach. But the calorie is broken in many ways, and there’s a strong case to be made for moving our food accounting system away from that one particular number. It’s time to take a more holistic look at what we eat.
Wilbur Atwater worked in a world with different problems. At the beginning of the twentieth century, nutritionists wanted to ensure people were well fed. The calorie was a useful way to quantify a person’s needs. Today, excess weight affects more people than hunger; 1.9 billion adults around the world are considered overweight, 600 million of them obese. Obesity brings with it a higher risk of diabetes, heart disease, and cancer. This is a new challenge, and it is likely to require a new metric.
One option is to focus on something other than energy intake. Like satiety, for instance. Picture a 300-calorie slice of cheesecake: it is going to be small. “So you’re going to feel very dissatisfied with that meal,” says Susan Roberts. If you eat 300 calories of a chicken salad instead, with nuts, olive oil, and roasted vegetables, “you’ve got a lot of different nutrients that are hitting all the signals quite nicely,” she says. “So you’re going to feel full after you’ve eaten it. That fullness is going to last for several hours.”
As a result of her research, Roberts has created a weight-loss plan that focuses on satiety rather than a straight calorie count. The idea is that foods that help people feel satisfied and full for longer should prevent them from overeating at lunch or searching for a snack soon after clearing the table. Whole apples, white fish, and Greek yogurt are on her list of the best foods for keeping hunger at bay.
There’s evidence to back up this idea: In one study, Roberts and colleagues found that people lost three times more weight by following her satiety plan compared with a traditional calorie-based one—and kept it off. Harvard nutritionist David Ludwig, who also proposes evaluating food on the basis of satiety instead of calories, has shown that teens given instant oats for breakfast consumed 650 more calories at lunch than their peers who were given the same number of breakfast calories in the form of a more satisfying omelette and fruit. Meanwhile, Adam Drewnowski, an epidemiologist at the University of Washington, has his own calorie upgrade: a nutrient density score. This system ranks food in terms of nutrition per calorie, rather than simply overall caloric value. Dark green vegetables and legumes score highly. Though the details of their approaches differ, all three agree: Changing how we measure our food can transform our relationship with it for the better.
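To make the nutrition-per-calorie idea concrete, here is a deliberately simplified sketch of such a ranking. The nutrient categories, reference values, and example numbers are hypothetical illustrations of the general principle, not Drewnowski’s published index.

```python
# Deliberately simplified nutrition-per-calorie score: sum a few beneficial
# nutrients (expressed as a percentage of a daily reference amount) and divide
# by the calories in the serving. All numbers below are hypothetical; the
# published nutrient density indices are more elaborate.

def density_score(percent_daily_values: list[float], kcal_per_serving: float) -> float:
    """Higher score = more nutrition delivered per calorie."""
    return sum(percent_daily_values) / kcal_per_serving

spinach = density_score([50, 30, 20], kcal_per_serving=25)   # hypothetical values
cookies = density_score([2, 1, 0], kcal_per_serving=200)     # hypothetical values
print(spinach > cookies)  # True: dark greens rank far above sweets per calorie
```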
Individual consumers could start using these ideas now. But persuading the food industry and its watchdogs, such as the FDA, to adopt an entirely new labeling system based on one of these alternative measures is much more of a challenge. Consumers are unlikely to see the calorie replaced by Roberts’s or Drewnowski’s units on their labels any time soon; nonetheless, this work is an important reminder that there are other ways to measure food, ones that might be more useful for both weight loss and overall health.
Down the line, another approach might eventually prove even more useful: personalized nutrition. Since 2005, David Wishart of the University of Alberta has been cataloging the hundreds of thousands of chemical compounds in our bodies, which make up what’s known as the human metabolome. There are now 42,000 chemicals on his list, and many of them help digest the food we eat. His food metabolome database is a more recent effort: it contains about 30,000 chemicals derived directly from food. Wishart estimates that both databases may end up listing more than a million compounds. “Humans eat an incredible variety of foods,” he says. “Then those are all transformed by our body. And they’re turned into all kinds of other compounds.” We have no idea what they all are, he adds—or what they do.
According to Wishart, these chemicals and their interactions affect energy balance. He points to research demonstrating that high-fructose corn syrup and other forms of added fructose (as opposed to fructose found in fruit) can trigger the creation of compounds that lead us to form an excess of fat cells, unrelated to additional calorie consumption. “If we cut back on some of these things,” he says, “it seems to revert our body back to more appropriate, arguably less efficient metabolism, so that we aren’t accumulating fat cells in our body.”
It increasingly seems that there are significant variations in the way each one of us metabolizes food, based on the tens of thousands—perhaps millions—of chemicals that make up each of our metabolomes. This, in combination with the individuality of each person’s gut microbiome, could lead to the development of personalized dietary recommendations. Wishart imagines a future where you could hold up your smartphone, snap a picture of a dish, and receive a verdict on how that food will affect you as well as how many calories you’ll extract from it. Your partner might receive completely different information from the same dish.
Or maybe the focus will shift to tweaking your microbial community: If you’re trying to lose weight, perhaps you will curate your gut microbiome so as to extract fewer calories without harming your overall health. Peter Turnbaugh cautions that the science is not yet able to recommend a particular set of microbes, let alone how best to get them inside your gut, but he takes comfort from the fact that our microbial populations are “very plastic and very malleable”—we already know that they change when we take antibiotics, when we travel, and when we eat different foods. “If we’re able to figure this out,” he says, “there is the chance that someday you might be able to tailor your microbiome” to get the outcomes you want.
None of these alternatives is ready to replace the calorie tomorrow. Yet the need for a new system of food accounting is clear. Just ask Haelle. “I’m kind of pissed at the scientific community for not coming up with something better for us,” she confesses, recalling a recent meltdown at TGI Friday’s as she navigated a confusing datasheet to find a low-calorie dish she could eat. There should be a better metric for people like her and Nash—people who know the health risks that come with being overweight and work hard to counter them. And it’s likely there will be. Science has already shown that the calorie is broken. Now it has to find a replacement.
You can listen to ‘End of the calorie’, the Gastropod episode accompanying this story, on Soundcloud or through iTunes.
This article first appeared on Mosaic and is republished here under a Creative Commons license.