This morning on CNN headline news I heard about the results of a new study on coffee and tea consumption and type 2 diabetes. It was a quick, 30-second bit in which the reporter claimed that coffee and tea could reduce the risk of type 2 diabetes by up to 25%.
Think about it this way. The study, published yesterday in the Archives of Internal Medicine, found that coffee and tea drinkers (decaf or not, it didn’t matter) have about a 7% decreased risk of type 2 diabetes per cup drunk per day, on average. People who drank about 3-4 cups of coffee or tea a day had a 25% decreased risk of diabetes. But what if the people who don’t drink coffee or tea were instead getting their caffeine (or, if they prefer, their decaffeinated beverages) from sugary, high-calorie drinks like soda? Maybe the tea and coffee drinkers have the baseline risk of disease, and the soda drinkers have an elevated risk! So really, if the study had instead looked at consumption of non-diet soda, the headline might have read something like “Soda can increase the risk of diabetes.”
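As a side note, the two numbers in the study are roughly consistent with each other if you assume (and this compounding assumption is mine, not necessarily the study's model) that each daily cup multiplies your remaining risk by about 0.93. A quick back-of-envelope sketch:

```python
# Back-of-envelope check: does ~7% lower risk per daily cup add up to
# the reported ~25% at 3-4 cups? Assume the reduction compounds
# multiplicatively (my simplifying assumption, not the study's).
per_cup_reduction = 0.07  # ~7% lower risk per daily cup, from the study

for cups in range(1, 5):
    remaining_risk = (1 - per_cup_reduction) ** cups
    total_reduction = (1 - remaining_risk) * 100
    print(f"{cups} cup(s)/day: ~{total_reduction:.0f}% lower risk")
```

At four cups a day this works out to roughly 25% lower risk, which matches the headline figure. Of course, none of this arithmetic tells us whether coffee is doing anything at all, which is the whole point of the soda question above.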
But wait! We know that’s probably true. Numerous studies have drawn a link between drinking soda and the risk of developing type 2 diabetes. In fact, one study published in the Journal of the American Medical Association in 2004 looked at the risk of developing type 2 diabetes in women. It showed that women who consumed one or more sugary drinks per day had nearly double the risk of developing diabetes compared to those who drank less than one per day!
When I googled around to see who else reported on this study (it’s everywhere!), I found few that pointed out that coffee and tea may not be doing anything at all. Most loudly touted the possible benefits of coffee and tea. Take, for example, this one from WebMD: “Coffee, Tea May Stall Diabetes,” it proclaims. Or this one from ABC, not shouting quite as loudly: “Tea, Coffee May Protect Against Diabetes.”
Apparently reporters think using the word “may” in the title is good enough. Just use “may,” and the rest of the article can be unstudied, wildly speculative, totally misleading miscommunication of science to the public.
I must admit, I did find one wonderful article on the study. The Guardian in the UK had an amazing piece that broke down the study point by point in a really succinct, easy-to-follow way. The article answered a few basic questions about the study, starting with “What do we know already?”, then “What does the new study show?”, and, I nearly peed my pants when I saw this, “How reliable are the findings?” They even go on to tell you who did the study (a collaboration of researchers from Australia, France, Scotland, and the US). Then they ask “What does it mean for me?” and “What should I do now?” What an absolutely fantastic way to report science to the public! They didn’t twist the study to mean something it does not necessarily mean just to get a shiny headline. I think all journalists need to look at this simple layout and realize that these are the essential questions they need to be answering in every single piece of reporting they do.
And readers, next time you see a report telling you that such-and-such causes this or that, take a moment to ask, does it really cause it? Or could there be another explanation?