10-14-2017 14:04 - edited 10-14-2017 14:50
I often hear the sentiment, "counting calories is pointless because, no matter how carefully you try to measure, there are so many sources of error in estimating calories that your results are meaningless." This is completely illogical and wrong, and fitness "experts" who post this on their blogs apparently skipped statistics class to go to the gym.
Here's why: a fundamental principle of statistics is that the sampling error of an average decreases as the sample size grows. So yes, while there are many sources of error in estimating the calories we eat, these errors tend to offset each other the more estimates we make.
So, if you estimate the calories of a single food serving, you could be off, high or low, by a lot (or not). But if you track 400 items over a month, the average of all the errors trends towards zero.
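You can see this offsetting effect directly with a minimal simulation sketch (the function name, trial count, and error range below are my own illustrative choices, not anything from the thread). It draws n random errors, each uniform between -25% and +25%, and reports how large the average error typically is:

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

def mean_abs_error(n, trials=2000):
    """Typical size of the average error across many simulated samples of n items."""
    total = 0.0
    for _ in range(trials):
        errs = [random.uniform(-0.25, 0.25) for _ in range(n)]
        total += abs(sum(errs) / n)
    return total / trials

# One food item, one day of 18 items, and a month of 18 * 28 = 504 items:
for n in (1, 18, 504):
    print(n, round(mean_abs_error(n), 4))
```

The typical error shrinks roughly with the square root of the number of items logged, which is exactly why a month of tracking is far more trustworthy than any single entry.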
(**WARNING** - MATH AHEAD)
You can prove this with a thought experiment in a spreadsheet: load the calories for each of the foods you ate yesterday into Column A. If you ate 18 things, you will have 18 values. In Column B, generate random numbers between -25% and +25% (the expression in Google Sheets is =(rand()-0.5)*50%). These are the errors you are going to apply to each food item. We are assuming up to a 25% measurement error. (Too small? Make it larger, I don't care, the results are the same.) You can generate all new random errors by pressing Ctrl+R to recalculate. In Column C, calculate an adjusted "erroneous" calorie estimate for each item by applying the error in B. The expression for C1 is =A1*(1+B1); copy this down to the cells below.
Now assume Column A holds the accurate calories for each item and Column C holds the erroneous estimates (yes, I know Column A is itself based on measurements that already contain errors, but we have to start somewhere; assume for this exercise they are accurate). In my sheet, the 196-calorie omelette serving drew a random error of +20.7%, so its erroneous estimate is high at 236 calories. The 383-calorie ribeye serving drew a -9.2% error and is estimated low at 348 calories, and so on. We now have 18 calorie variances, some significant, some not, all of which change each time I press Ctrl+R to recalculate.
So here's the payoff, if you're still reading: compare the totals of Columns A and C and calculate the percent difference, and you will see the variance of the total is far smaller than the up-to-25% errors on the individual food items. Press Ctrl+R repeatedly and the total stays within a range of roughly +/-5%. Replicate those 18 items 28 times to simulate four weeks of eating, and the variance falls to under 1%. Finally, divide the difference of the totals by 3,500 calories per pound to get the weight error attributable to measurement noise over the four weeks; in my sheet it comes to about 0.1 pound (based on 1,777 calories consumed per day).
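If you'd rather not build the spreadsheet, here is a rough Python equivalent of the same experiment. The 18 calorie values are illustrative stand-ins (chosen so they sum to 1,777 per day, matching the figure above), not anyone's real food log:

```python
import random

random.seed(7)  # fixed seed so the run is repeatable

# Illustrative calorie values for 18 daily food items (sum = 1,777),
# replicated over 28 days to simulate four weeks of eating.
daily_items = [196, 383, 110, 85, 60, 120, 95, 45, 150,
               70, 55, 40, 88, 62, 50, 73, 30, 65]

true_total = sum(daily_items) * 28

# Apply an independent random error in [-25%, +25%] to every logged item.
noisy_total = sum(cal * (1 + random.uniform(-0.25, 0.25))
                  for _ in range(28) for cal in daily_items)

pct_diff = (noisy_total - true_total) / true_total * 100
lbs_diff = (noisy_total - true_total) / 3500  # 3,500 kcal per pound

print(f"total variance over 4 weeks: {pct_diff:+.2f}%")
print(f"weight impact over 4 weeks:  {lbs_diff:+.2f} lb")
```

Despite every single entry being off by as much as 25%, the four-week total lands within about a percent of the truth, a fraction of a pound on the scale.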
Projecting my weight to within 0.1 pound in a month is plenty accurate enough for me, and I hope this narrative puts to rest the notion that tracking your food produces "meaningless" results.
10-15-2017 01:10
And even if your average is not the true average, you still have a number you can calibrate against your weight trend and adjust accordingly. If you always underestimate, on paper it looks like you can eat very little, while in practice you are still getting those calories; the reverse holds if you always overestimate. So just log everything, and you have a baseline you can start adjusting from. If you don't log, you are going in blind and have no idea what you are adjusting.
(Sorry, I am in a rush, so I hope my post makes sense.)
Karolien | The Netherlands
10-15-2017 10:20 - edited 10-15-2017 10:35
@Esya wrote:And even if your average is not the true average, you still have a number you can calibrate against your weight trend and adjust accordingly. If you always underestimate, on paper it looks like you can eat very little, while in practice you are still getting those calories; the reverse holds if you always overestimate. So just log everything, and you have a baseline you can start adjusting from. If you don't log, you are going in blind and have no idea what you are adjusting.
(Sorry, I am in a rush, so I hope my post makes sense.)
@Esya - Exactly! And, yes, it makes sense to me. To my thinking, calories burned and eaten are subject to measurement errors that fall into two categories: random and systematic. The random errors are noise, whose average diminishes with sample size. The systematic errors create a bias, which can be calibrated out. In my case, Fitbit's bias is to underestimate my calorie deficit by 160 calories per day.
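That calibration step can be sketched in a few lines. Everything here is illustrative: the helper name, the flat 500-calorie logged deficit, and the scale reading are invented numbers chosen to reproduce a 160-calorie-per-day bias, not real Fitbit data:

```python
def estimate_bias(logged_deficits, actual_weight_change_lbs, kcal_per_lb=3500):
    """Per-day systematic bias; positive = the tracker understates the true deficit."""
    predicted_loss = sum(logged_deficits) / kcal_per_lb   # lbs the log predicts
    actual_loss = -actual_weight_change_lbs               # negative change = loss
    missing_kcal = (actual_loss - predicted_loss) * kcal_per_lb
    return missing_kcal / len(logged_deficits)

# e.g. a logged 500 kcal/day deficit for 28 days, but the scale dropped 5.28 lb:
bias = estimate_bias([500] * 28, actual_weight_change_lbs=-5.28)
print(f"{bias:+.0f} kcal/day")  # prints +160 kcal/day
```

A positive bias means the tracker understates the true deficit, so adding it back to the logged numbers should bring predicted and actual weight change into line.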
Here are my "raw" results before calibrating out the bias:
And after applying a 160 calorie per day bias:
This is an uncanny agreement between actual and predicted results, which leads me to believe that my underlying assumptions and hypotheses are sound. And, of all the potential sources of systematic error, I believe most of it is due to Fitbit underestimating my BMR.
10-15-2017 13:08
Interesting theories, as usual Dave.
Observation has shown that calorie-counting errors tend to fall mostly on one side: underestimating calories consumed (let's leave calories burned aside for the moment).
If a person who is logging their food overestimates their calories in, they will be in a position of losing more weight than they intended. How often does that happen?
If a person who is logging their food underestimates their calories in, they will find the weight is not coming off as expected, or possibly even that they are gaining weight. How often does that happen?
My suggestion is that the mental/emotional aspect of eating creates a consistent bias towards underestimating, which is why so many struggle with what should be (as you stated Dave) relatively straightforward math.
I think it's easier, once a person reaches their maintenance goal, to dial in exactly what adjustment the Fitbit numbers need so that calories in/out match the weight plan. It takes time to reach the fine detail Dave has, being able to estimate a 160-calorie difference. (Mine, BTW, underestimates calories burned by 200 a day.)
10-15-2017 20:49 - edited 10-15-2017 20:51
@WavyDavey wrote:...
My suggestion is that the mental/emotional aspect of eating creates a consistent bias towards underestimating, which is why so many struggle with what should be (as you stated Dave) relatively straightforward math.
..
@WavyDavey - thanks, and agreed. I think there is a tendency to underestimate, particularly when we operate from the seat of our pants and do not log in writing or an app. When we force ourselves to create a written record, there is less room to hide.
For me, overeating is a behavioral problem with elements related to cravings, habits, and emotions. If we are not careful, we can fall into a pattern of "learned helplessness" wherein we conclude we are powerless to control our behavior and outcomes. When I observe so-called experts advising people that "weight-loss is exceedingly difficult and mysterious, and no matter what you try to do, it won't work", my blood pressure goes up. I think we should encourage people to pursue tactics that work, and food logging is one of the most effective.
10-18-2017 17:13
I agree with everything in every post here, except the math: that is just exhausting.
For me it's common sense: what I eat, I log. When I get on the scale, I see the fruits of my labor. If I lost weight, I carry on as I did. If I didn't, I make adjustments and get on again. That's it. No mystery, just a strong desire to be at my ideal weight.
Elena | Pennsylvania