Let’s face it. There’s more nutrition advice available online than ever.
It’s important for consumers to recognize when they are being misled. By misled, I mean being told only part of the story, which means your decision is not an informed one.
So first we will review the different types of evidence followed by 7 red flags that you should keep in your back pocket.
Types of evidence
Compound Interest reviewed the different types of scientific evidence from weakest to strongest (see their chart below). Evidence ranges from anecdotal, which usually includes testimonials and expert opinions, to systematic reviews looking at all of the research together.
Observational studies cannot prove causation, but when enough evidence piles up, causation can be inferred. Of course, randomized controlled trials are always better, but they are not always ethical or possible, especially when looking at the effects of diet or nutrients over decades.
When reading information, you want to consider not only the type of evidence being cited but the weight of all the research. Sometimes there really is no scientific evidence for a claim.
Other times the research is new and emerging, meaning there have been some positive studies but more are needed to see if the results can be replicated. And sometimes the research shows mixed results.
And the best case is when there is such a large body of evidence for a diet-health link, that health organizations develop consensus statements.
So with this in mind, I came up with 7 red flags to watch out for.
1. Strong claims, no product testing
Whether it’s a dietary supplement, a touted superfood or a diet program, it’s a red flag when the product hasn’t been tested.
Take probiotics as an example. Research shows good bacteria are beneficial for digestion. But even if a product contains super-high levels of bacteria, unless it has been tested you have no idea whether the good bugs survive the acidic environment of the stomach.
You only know the product does what it says it does if it’s been tested.
2. Doesn’t qualify the science
The most common mistake in the blogosphere and in nutrition books is cherry-picking the evidence to fit a nutrition point of view. This happens all the time with popular diets. Dr. David Katz describes the classic formula in an article he wrote for US News:
It’s something like: lay claim to a revelation; cite the literature selectively to back up your argument; ignore all evidence to the contrary; offer up a scapegoat, silver bullet or both; and whatever you do, don’t say that the only way to get the benefits of eating well and exercising is by eating well and exercising. Oh, and be sure to throw everyone who came before you under the bus!
Another watch-out on social media is when a post makes a strong claim but lists no source (or study) — just a pretty graph with no indication of where it came from.
The most credible sources of information will be upfront with readers about the state of the science, saying when it’s just emerging, when it’s based only on animal studies, or when there’s no science yet.
I think people deserve to know!
3. Wrong population
If you’re reading about how bad wheat is for everyone and the study cited for support was conducted with celiac patients — and you don’t have celiac disease — that’s a red flag. Or if someone’s talking about toxic carbs and all you can see are diabetes-related studies, and you don’t have diabetes, those results don’t apply to you.
Studies of men do not necessarily apply to women, and studies of those over 65 don’t necessarily apply to those under 40.
In short, the study population matters.
4. Outdated studies
So a blogger writes about how good (or bad) something is for you and links to a set of studies published before 2000. All of those are over 15 years old!
A lot can change in the research world over a decade, and if someone is citing very old studies, they are either not staying current or are guilty of red flag #2.
5. The information is biased
According to the Academy of Nutrition and Dietetics 2011 Nutrition and You Survey, TV is where most people learn about food and nutrition (67%) followed by magazines (41%), the internet (40%), newspapers (20%), doctors (16%), friends and family (16%), radio (13%) and books (11%).
Bias depends on the source. For example, when studies are reported in the news, the goal is to draw in readers and viewers. So coverage tends to center on controversy or a single new study, and important facts can be left out (like what the rest of the research says).
A vitamin company’s website is probably not the best place to get research on the benefits of vitamins. And I often see bloggers devoted to a specific diet rejecting any study that contradicts their diet while embracing any study that supports it.
6. Too good to be true
Words like “proven” and “miracle cure,” or any language implying that X causes Y, are watch-outs. My rule of thumb: the stronger the claim, the more science is needed to support it.
When I go to sites with those types of claims but can’t easily find the research cited, I run fast!
7. Links to articles or books for support, not original studies
I find that people without a science background do this most frequently. They will claim that something in the diet is harmful and link to an article by someone else saying the same thing. That does not qualify as scientific support. This should make you question the person’s understanding of the topic and whether they are the best person to offer advice.
Research is imperfect and complicated, but without it where would we be? We need it, and we need people and companies to cite it responsibly. I will often refer back to this page in my series.
Are any of these red flags familiar to you? Anything I missed?