One of my biggest interests is using food as medicine. Before I started medical school, I often wondered why doctors seemed to either deny or be oblivious to the power of food as medicine. After all, there was an entire holistic health movement out there touting the existence of a veritable pharmacopoeia of remedies without side effects for common ailments that could be pulled from a kitchen cabinet. I didn’t buy that it was because doctors were bad people, or that they didn’t care. My own experience did not jibe with the idea of dismissing all integrative practitioners as quacks. I realized that the truth had to be in the system – something like a cultural divide, or an ethical divide. By the time I started medical school, I had a burning desire to bridge this particular gap between wellness-minded folks and health professionals (the two are not mutually exclusive). Now, four years later, I haven’t changed the world, but I have made progress toward understanding why things are the way they are.
As medical students, we were taught to follow what is called “Evidence-Based Medicine” – medical recommendations based on the most current research. Evidence-based medicine has become more than a doctrine of medicine – it has become a moral obligation. Responsible doctors dispense therapies that are based on prior research and have stood the test of time, rather than new-fangled drugs and procedures, and for good reason. Drugs are a little like electronics in that being an early adopter can have dire consequences. One example is weight-loss supplements; Danielle Ofri recently wrote a great article on this topic in the New York Times. Every year some new medicine is recalled – yesterday’s hope becomes today’s serious side effect of the year. The diabetes drug Avandia is one example of a drug that earned the hero-turned-villain award.
It should be noted that the best evidence, or as we call it in the trade, the “gold standard,” comes from double-blind randomized controlled trials that are properly “powered” – meaning there are enough subjects in the study to reach a significant result. There are plenty of solid studies out of reputable institutions that show a correlation between a type of diet and cardiovascular disease, mental health, and many other illnesses. However, once the studies become more focused on specific effects of specific foods, the quality of the evidence starts to wane. It is very easy to find a study, for example, that shows rats fed blueberries have fewer of the inflammatory markers associated with clogged arteries.
However, finding a good study in humans showing that eating a certain amount of blueberries every day lowers cholesterol is a lot harder than finding one showing that 40mg of Lipitor prevents heart disease and stroke in people with normal cholesterol. Why? Well, I can’t say I know everything about what it takes to conduct good research, but I would bet my student loans (which are no joke, might I add) that it all comes down to money. The average cost of research and development for a single drug in the 1990s was $802 million. Consider that figure alongside the fact that the majority of drug company money is spent on marketing, and you get some perspective on the kind of numbers we are talking about here. Even if it cost a fraction of this amount to test a food’s effect on a disease process, where would the money come from? The fruit and vegetable industry? The all-powerful herbs lobby? Right…
Even though the evidence for using food as medicine for specific conditions is weaker, there are plenty of reasons, even under the current paradigm, that providers should be obligated to give patients certain simple food and lifestyle prescriptions. Many doctors agree, yet there are reasons they don’t do it. Stay tuned! I will discuss these in a follow-up post.