Four Ways Health News Can Fool You

Health news comes to us every day through newspapers, radio, TV, and the internet. You’ll read or hear: Do this to live longer! Don’t do that, it will make you sick! Watch out for this new threat to your health!

Sometimes these reports give good information, but sometimes they mislead. How do you separate the useful from the potentially harmful stories? Here are four ways health news can fool us, and keys to seeing through them.

1. Confusing relative risk with absolute (real) risk

Say you read that eating a certain food raises your risk of pancreatic cancer by 50 percent. Pancreatic cancer is a terrible disease, so maybe you should avoid that food, even if it’s one of your favorites.

Except that the overall rate of pancreatic cancer is roughly 1 per 10,000, or 0.01 percent. Raising the risk by half (which is what 50 percent means) would bring the risk to 0.015 percent, or 1.5 in 10,000.

A measure of how risk differs between groups is called relative risk. It usually sounds much more impressive (a 50 percent increased risk of cancer!) than the absolute change in risk (in this example, 0.5 in 10,000, or 0.005 percent).

Russell L. Rothman, MD, of Vanderbilt University, says, “Many times, the relative risk sounds much greater than the absolute risk, which can be confusing. When you hear numbers about risk, it’s best to focus on the absolute risk.”
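
If it helps to see that arithmetic spelled out, here is a minimal sketch in Python using the figures from the pancreatic-cancer example above (the figures are illustrative, not data from a specific study):

```python
# Relative vs. absolute risk, using the illustrative figures above.
baseline_risk = 1 / 10_000            # about 1 in 10,000, or 0.01 percent
relative_increase = 0.50              # "raises your risk by 50 percent"

new_risk = baseline_risk * (1 + relative_increase)
absolute_change = new_risk - baseline_risk

print(f"baseline risk:   {baseline_risk:.3%}")    # 0.010%
print(f"risk with food:  {new_risk:.3%}")         # 0.015% (1.5 in 10,000)
print(f"absolute change: {absolute_change:.3%}")  # 0.005% (0.5 in 10,000)
```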

The same applies to benefits, which are risks in reverse. You might read that a drug cuts the risk of hip fracture over a three-year period by 60 percent. You should take that drug, right?

Except that this statistic could equally mean that without the drug, say, 1 percent (1 in 100) of people have a hip fracture over those three years, while with the drug only 0.4 percent (1 in 250) do.
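
The same kind of sketch, with the illustrative hip-fracture figures above, shows how a 60 percent relative reduction can be a small absolute one:

```python
# Relative vs. absolute benefit, using the hip-fracture figures above.
risk_without_drug = 0.01              # 1 percent (1 in 100) over three years
relative_reduction = 0.60             # "cuts the risk ... by 60 percent"

risk_with_drug = risk_without_drug * (1 - relative_reduction)
absolute_reduction = risk_without_drug - risk_with_drug

print(f"risk without drug:  {risk_without_drug:.1%}")   # 1.0%
print(f"risk with drug:     {risk_with_drug:.1%}")      # 0.4% (1 in 250)
print(f"absolute reduction: {absolute_reduction:.1%}")  # 0.6 percentage points
```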

2. Confusing correlation with causation

When two things frequently occur together — for example, obesity and Type 2 diabetes — health experts often say one caused the other. But even a strong correlation (sometimes called “association”) doesn’t tell us about cause.

Instead of obesity causing Type 2 diabetes, maybe Type 2 causes obesity. Or maybe some other factors cause both obesity and diabetes.

Several such factors are strongly correlated with both obesity and Type 2, including low income, lack of education, membership in an oppressed group, history of trauma, physical disability, and high stress levels.

Such factors are called “confounding variables.” Good studies “control” for confounding variables by making sure the groups being compared have similar levels of these factors.

This can be hard to do. A news report about a diabetes treatment, for example, should say something like “after controlling (or adjusting) for age, weight, duration of diabetes, and socioeconomic status (SES).”

If a report doesn’t include that kind of language, be skeptical of the results. You might want to read the abstract of the actual study on PubMed, the U.S. government’s database of published medical research.
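
To see why controlling matters, here is a small toy example in Python. Every number in it is invented for illustration: a crude comparison makes Group A look twice as likely to have the outcome, but once the comparison is stratified by a confounding variable, the difference disappears.

```python
# Toy data, invented for illustration: (cases, total) for two groups,
# split by a confounding variable ("high stress" vs. "low stress").
data = {
    "high stress": {"Group A": (30, 100), "Group B": (6, 20)},
    "low stress":  {"Group A": (2, 20),   "Group B": (10, 100)},
}

# Crude (unadjusted) comparison: pool everyone together.
for group in ("Group A", "Group B"):
    cases = sum(data[stratum][group][0] for stratum in data)
    total = sum(data[stratum][group][1] for stratum in data)
    print(f"crude rate, {group}: {cases / total:.0%}")
# Group A looks worse: about 27% vs. 13%.

# Controlled (stratified) comparison: compare like with like.
for stratum, groups in data.items():
    rates = ", ".join(f"{g}: {c / t:.0%}" for g, (c, t) in groups.items())
    print(f"{stratum} -> {rates}")
# Within each stress level the rates are identical (30% and 10%),
# so the crude difference came from the confounder, not the groups.
```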

3. Jumping to conclusions too soon

Lots of treatments work for a little while, then become less effective. Side effects may not show up right away, but emerge over time.

Dr. Roy Taylor at Newcastle University in England has reported on several studies in which people with Type 2 diabetes recovered normal insulin function on a very-low-calorie diet. They lost weight, their livers lost fat, and they achieved diabetes remission. Statistically, these results look very impressive. The media trumpeted a new diabetes cure.

None of the studies, though, has followed the patients for more than a year. There is no way to know how long the benefits lasted and no reason to think they necessarily did.

Radical diets typically produce rapid weight loss, but 80 percent to 90 percent of subjects regain the weight (and sometimes more) within five years, often much sooner. Even in the short one-year follow-up for the studies outlined above, many of the experimental subjects were back on diabetes medication.

Likewise, the negative effects of some drugs don’t show up until they’ve been in use for a while. The diabetes drugs pioglitazone (Actos) and rosiglitazone (Avandia) were both approved and hailed as breakthroughs. After a few years, though, Actos was withdrawn from some European markets because of an association with an increased risk of bladder cancer, and Avandia because of an increased risk of heart disease.

4. Confusing statistical with clinical significance

Drug studies have to show results strong enough that they are unlikely to have happened by chance. In statistical language, such results are “statistically significant.” The more participants a study includes, the easier it is to show that a given effect is not just chance.

That’s why drug companies run big trials with hundreds or thousands of patients. In a trial of 30 patients, if a drug reduced blood pressure by an average of 10 points, that might not be statistically significant. Maybe those patients were special.

If a study with 2,000 patients got the same 10-point average reduction, that probably would be statistically significant. But it’s still just 10 points. It might not be clinically significant, meaning it won’t necessarily help enough to outweigh potential side effects and costs.
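
Here is a rough sketch of that idea in Python, assuming (purely for illustration) that blood-pressure changes vary from patient to patient with a spread of about 25 points. The same 10-point average reduction fails to reach statistical significance with 30 patients per group but easily reaches it with 2,000.

```python
# Same average effect, different sample sizes: a two-sample t-test sketch.
# The 25-point standard deviation is an assumption for illustration.
import math
from scipy import stats

effect = 10.0   # average blood-pressure reduction, in points
sd = 25.0       # assumed spread between patients

for n_per_group in (30, 2000):
    se = sd * math.sqrt(2 / n_per_group)   # standard error of the difference
    t = effect / se                        # t statistic
    df = 2 * n_per_group - 2
    p = 2 * stats.t.sf(t, df)              # two-sided p-value
    print(f"n = {n_per_group:4d} per group: t = {t:.2f}, p = {p:.3g}")
# n = 30:   p is roughly 0.13 -- not statistically significant.
# n = 2000: p is far below 0.05 -- statistically significant,
#           but the effect is still only 10 points.
```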

Studies can also mislead through bias, where researchers see what they want to see (or what their funders paid them to see), or through overhype, where the results are made out to be more significant than they really are.

Learning to watch out for and question these factors will help you make better use of health news.

Want to learn more about recent Type 1 diabetes research? Read “Reversing Type 1 Diabetes: New Research From Boston Children’s Hospital,” “Can a Very Low-Carb-Diet Help People With Type 1 Diabetes?” and “Vaccine Leads to Lasting Improvement in Type 1 Diabetes,” then see our weekly roundup of breaking diabetes news.
