Vitamin E Experts Respond To Controversial Study

© Whole Foods

February 2005

An interview with Drs. Jeffrey Blumberg, Kent DeZee and Maret Traber.

By Richard A. Passwater, Ph.D., and Richard Passwater, Jr.


Stop the presses and call CNN! We have just completed a meta-analysis and have found that those taking prescription drugs have a higher risk of dying than those who don’t take prescriptions. Previously, we conducted an observational study and found that when people use umbrellas, there is a higher risk of rain. The media surely will trumpet this “alarming news” and warn against using prescriptions and umbrellas.

Absurd, you say. That’s our point. It was equally absurd how the media headlined a report given at a conference in November 2004. Evidently, there wasn’t much real news at the conference, and since the reporters had to report on something, they jumped on an armchair meta-analysis of primarily sick and medicated patients who were also taking vitamin E, and presented it as if it actually showed that high-dose vitamin E was a risk. The authors themselves said only that they believed their meta-analysis suggested the use of high-dose vitamin E by such ill patients, who were also taking prescriptions, should be evaluated more closely.

We must question the state of journalism when reporters run so hard with an oral report taken out of the context of formal peer review, and one that flies in the face of the extensive existing body of scientific evidence.

The resulting headlines and “breaking news” unfortunately presented the armchair study incorrectly as a warning that 400 IU and more of vitamin E caused early death. This understandably scared many people away from continuing their vitamin E supplements, which were helping to protect them from many diseases. A national survey conducted by the Dietary Supplement Information Bureau two weeks after the alarmist reports found that 18% of Americans were less likely to take vitamin E based on the news of the meta-analysis.

Where were those reporters two weeks later when a clinical study showed that vitamin E was associated with a reduced risk of heart attacks in 40% of diabetics (Diabetes Care 2767 Nov 2004)? Where were the reporters when, a week later, researchers reported that people taking vitamin E had a 62% lower risk of dying of ALS (Ann. Neurology Nov 2004)? These two studies were reported in scientific journals and not at conferences in exotic places that draw reporters. Or, was the lack of reporting on the good news about vitamin E simply a case of “Oh, vitamin E again—we covered vitamin E a couple of weeks ago and vitamin E is not news this week. It’s old news”?

So, let’s go beyond the sensationalistic headlines and look at the science—or should we say lack of science—behind the report. The editorial accompanying the study, which was later published in the January 4, 2005 issue of the Annals of Internal Medicine, cautioned, “Thus, while Miller and colleagues’ report provides intriguing evidence suggesting that higher doses of vitamin E cause death, the case is not ironclad.” The authors themselves cautioned, “The generalizability of the findings to healthy adults is uncertain.”

For this column, we called upon two leading vitamin E experts, Drs. Maret Traber and Jeffrey Blumberg. We also consulted with meta-analysis expert, Dr. Kent DeZee, for clarification of some issues concerning statistical techniques in clinical studies.

Jeffrey B. Blumberg, Ph.D., is professor at the Friedman School of Nutrition Science and Policy, associate director and chief of the Antioxidants Research Laboratory, and senior scientist at the Jean Mayer USDA Human Nutrition Research Center on Aging at Tufts University, Boston, MA. Dr. Blumberg has been widely respected as an antioxidant nutrient expert for decades. Readers may remember that he chatted with us in August 2000 concerning the newly established dietary recommendations for antioxidants including vitamin E (http://www.drpasswater.com/nutrition_library/antioxidants_2.html).

Major Kent DeZee, M.D. is a general internist at the Walter Reed Army Medical Center in Washington, DC. He has been involved in humanitarian medicine, helping in various countries including Cambodia and Chuuk atoll in the Federated States of Micronesia. Of special interest to this topic is that Dr. DeZee teaches Meta-analysis 101 for the Society of General Internal Medicine.

Maret G. Traber, Ph.D., is a principal investigator in the Linus Pauling Institute and professor in the Department of Nutrition and Exercise Sciences at Oregon State University in Corvallis, OR. With over 150 scientific publications and more than 20 years’ experience with vitamin E, Dr. Traber is considered one of the world’s leading experts on vitamin E. Her research focuses on vitamin E function and metabolism in humans. Dr. Traber currently serves on the editorial boards of the Journal of Nutrition and Free Radical Biology & Medicine, and is associate editor of Lipids. She served on the National Academy of Sciences, Institute of Medicine Panel on Dietary Antioxidants that set the dietary requirements for antioxidant vitamins in 2000.

One of Dr. Traber’s particular fields of expertise concerning vitamin E is the protein that transports and transfers vitamin E in the body. Readers may remember her three-part series on vitamin E in this column in 1997 (also http://www.drpasswater.com/nutrition_library/traber1.html).


Passwater and Passwater: The report by Dr. Edgar R. Miller, III, and his colleagues has caused considerable concern among those who routinely take vitamin E supplements. This is largely because of the “spin” put on the findings of the report by the media, more so than the report itself. Let’s get to the bottom line right away. Does this meta-analysis provide credible evidence that vitamin E is harmful in dosages above 400 IU?

Blumberg: No. It is important to appreciate that the meta-analysis is not a new clinical trial, but an analysis made by lumping together relatively few clinical trials involving patients sick with chronic diseases and at high risk of death, in order to build a larger database that might yield statistically significant results where there were none before. The analysis did not consider at all the dozens of observational studies, involving millions of people, that show vitamin E supplementation can be beneficial and completely safe.

DeZee: I have trouble accepting the conclusion regarding high dose vitamin E due to the statistical methods used and the lack of controlling for study quality and publication or selection bias. I suggest that correction of any one of these factors could negate the marginally significant results. With these problems and multiple other studies suggesting vitamin E has no effect on mortality, telling our patients that it may be harmful seems premature.

Traber: No, it doesn’t. It flies in the face of decades of nutrition research.

Passwater and Passwater: Dr. Blumberg, what does the public need to understand about this report?

Blumberg: They need to know that the totality of experimental, observational and clinical studies on vitamin E have been evaluated by the Institute of Medicine, which concluded that this vitamin is safe and absent any potential harm at doses of 1100 IU (synthetic vitamin E or more precisely the d,l-alpha tocopheryl acetate form of vitamin E) and 1500 IU (natural source vitamin E form or more precisely the d-alpha tocopheryl acetate form of vitamin E). The Dietary Reference Intake (DRI) panel has established these values as the Tolerable Upper Intake Level (UL), and I consider these values very conservative. The Institute of Medicine defines the UL as the highest level of daily nutrient intake that is likely to pose no risk of adverse health effects to almost all individuals in the general population. As intake increases above the UL, the risk of adverse effects increases. The UL is not meant to apply to individuals who are being treated with the nutrient under medical supervision.

Science is not based on a single study for many reasons. One should never take a single study out of the context of the entire body of knowledge.

Passwater and Passwater: Dr. Traber, you were a member of the Institute of Medicine’s Panel on Dietary Antioxidants and Related Compounds that established the DRI for vitamin E and evaluated the safety of vitamin E. Is vitamin E safe?

Traber: Yes, and I agree with Dr. Blumberg that it is important to look at the totality of the evidence that vitamin E does no harm. This includes in vitro data, animal studies, epidemiological studies and clinical trials. This is where the media blitz concerning the meta-analysis was completely out of proportion. The authors of the study could offer no scientific reason as to why vitamin E increased mortality. Given the thorough search of the literature by the DRI committee, and the lack of evidence for adverse vitamin E effects in humans or other animals, it is difficult to rationalize how vitamin E might cause increased death.

One of the difficulties the panel had in setting the upper limit for vitamin E was the relative paucity of adverse effects of vitamin E in humans. We had to rely on studies done in rodents to find any adverse effects, and the adverse effect found was an increased tendency to bleed.

Our readers can read the 97-page DRI report on vitamin E free on the web at http://books.nap.edu/catalog/9810.html. For more information on vitamin E, they will find “Vitamin E: function and metabolism” (FASEB J. 13: 1145-1155; 1999) by Dr. R. Brigelius-Flohé and myself a good place to start.

Blumberg: Readers should also keep in mind that the National Institutes of Health is currently supporting several on-going clinical trials with vitamin E at 400 IU and above and has indicated no intention to stop these investigations. However, unfortunately, I understand that many people are dropping out of these very important studies out of fear that vitamin E is more likely to kill them than help them.

Passwater and Passwater: Dr. DeZee, why are we seeing so many meta-analyses these days?

DeZee: A meta-analysis is a technique to combine the results of all similarly designed research reports on a narrow, focused topic. There are some situations where this method is particularly useful.

One of them is to combine the results of all similar studies with a rare outcome. When outcomes are rare, such as death, a researcher must follow many, many patients, at great expense, to determine if there is a statistically significant result. In most cases, though, these large trials don’t exist, but there are small trials that have already been done. If one can combine the experiences of these small trials, then one wouldn’t need to spend the time or money to do the very large trial, yet still come up with an acceptable answer.

Of course, there are rules that one must follow. First and foremost, the researcher must avoid bias. Bias is what occurs when a researcher systematically affects the results of his study, consciously or unconsciously, by inserting his own beliefs or flawed methods. To avoid bias, the researcher must exhaustively look for all the studies on the topic and develop fair and reasonable rules to decide which studies to keep and which studies to exclude. All of the included studies should be done the same way (e.g. all placebo-controlled trials) and should have studied the same basic topic in similar types of patients.

The studies need not be exactly the same, just not obviously different. As an example, it would be wrong to combine the results of 10 trials of vitamin E on healthy, 30-year-old patients with one trial of old, high-risk diabetic patients.
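Dr. DeZee’s point about combining the experiences of many small trials can be illustrated with a minimal sketch of fixed-effect, inverse-variance pooling of risk ratios, one of the standard meta-analytic methods he refers to later in this interview. The trial counts below are entirely hypothetical, not data from any study discussed here.

```python
import math

def pool_fixed_effect(trials):
    """Inverse-variance fixed-effect pooling of log risk ratios.

    Each trial is (deaths_treated, n_treated, deaths_control, n_control).
    Returns the pooled risk ratio and its approximate 95% confidence interval.
    """
    num, den = 0.0, 0.0
    for dt, nt, dc, nc in trials:
        log_rr = math.log((dt / nt) / (dc / nc))
        # Standard large-sample approximation to the variance of log RR
        var = 1 / dt - 1 / nt + 1 / dc - 1 / nc
        weight = 1 / var                 # bigger, more precise trials count more
        num += weight * log_rr
        den += weight
    pooled = num / den
    se = math.sqrt(1 / den)
    lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
    return math.exp(pooled), (lo, hi)

# Hypothetical counts for three small trials
trials = [(12, 500, 10, 500), (30, 1000, 28, 1000), (8, 400, 9, 400)]
rr, (lo, hi) = pool_fixed_effect(trials)
print(f"pooled RR = {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

No single trial here is large enough to say anything about death rates, but pooling them narrows the confidence interval, which is exactly why meta-analysis is attractive for rare outcomes.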

Passwater and Passwater: Dr. Traber, what is the intended purpose of such meta “studies?”

Traber: Meta-analysis is a statistical tool to group the results from several trials to generate new hypotheses. This is not a technique to prove the hypothesis. Initially these kinds of mathematical tools were used much like the polls around Election Day. That is, outcomes from one city might not tell you the results, but if you choose a random sampling from several representative cities, then you could better predict the outcome.

It is very expensive to carry out a clinical trial, and the larger the trial the less information can be learned about each subject. Therefore, the meta-analysis allows the investigator to combine the results from several trials so that the number of subjects studied can be very large, e.g. 100,000 people.

Passwater and Passwater: Dr. DeZee, what are the limitations of meta-analysis?

DeZee: Meta-analysis is subject to a number of limitations. We have already discussed the possibility of researcher bias.

Another limitation of meta-analysis is publication bias. Not all research is published. This is particularly true of small trials that do not have a statistically significant result, as journal editors tend not to publish these. For the Miller study, we think (based on statistical tests we did) there might have been small studies in which vitamin E had no effect on mortality that were never published. If these theoretically missing trials could be found and included in the meta-analysis, they would tend to shift the results away from harm. This is particularly a problem when the results achieve statistical significance only by a small margin, as in the article by Miller. Even a few unpublished studies could change the results from suggesting harm to suggesting neither harm nor benefit.

There are other limitations. The summary is only as good as the ingredients, so one should comment on the quality of the included studies and determine if high-quality studies showed different results than low-quality studies. Low-quality studies tend to overestimate the results. There are also certain rules for the statistical methods used to combine the studies.

The bottom line with the limitations of meta-analysis is that one should be able to read a meta-analysis and say, “I believe the results because I think the authors did a thorough search on a narrow, focused topic, found all of the studies on this topic, and had clear, reasonable guidelines for the studies they included. They reported all meaningful outcomes, used appropriate statistical methods, and looked for other aspects of the trials that might explain the results.”

Passwater and Passwater: Dr. Blumberg, are there serious limitations in the Miller et al. meta-analysis of vitamin E supplementation and all-cause mortality (ACM) trial?

Blumberg: Their meta-analysis is limited in so many ways it is not possible to quickly reiterate them all, but extrapolating results from secondary prevention randomized clinical trials (and their polypharmacy regimens) to effects in healthy populations is inappropriate. 

A serious flaw is basing the analysis on “All Cause Mortality” (ACM). This can include accidents as well as cancer. It is very difficult to understand the apparent adverse effect (ACM) when the authors do not even mention what it was that people were dying of. Did the causes of death have a biologically plausible mechanism related to vitamin E (e.g., death by homicide or auto accident)? Use of combined endpoints for an outcome can be suspect: were the authors unable to identify any statistically significant relationship with specific causes of mortality in such a tiny number of people so that they had to combine totally unrelated events (e.g., cancer and drowning)?

Another issue is that this selective approach to meta-analysis excludes consideration of all the experimental and observational studies showing benefits of vitamin E supplementation. It is not clear why Miller et al. chose to exclude from their analysis studies with fewer than 10 deaths, as the goal of meta-analysis is to combine as many studies as possible to increase sample size and heterogeneity in outcome. Further, the authors excluded other studies based on arbitrary criteria and omitted two studies that clearly showed the benefit of combined natural-source vitamin E and vitamin C supplementation, the ASAP and TAA studies.

Passwater and Passwater: All of these are interesting points; in a moment, let’s consider the one dealing with the exclusion of studies in which there were no reported deaths. That certainly seems to be a major flaw. But first, we want to go to a more basic question, Dr. DeZee; did the authors use the appropriate statistical model for this meta-analysis?

DeZee: My colleagues and I at Walter Reed were interested in the implications of the Miller report and how it might affect our patients, so we did our own statistical analysis.

It was not clear to us why the Miller group chose hierarchical logistic regression rather than traditional meta-analytic approaches. We reanalyzed their data from the 11 high-dose trials in their Figure 2 using two standard methods (Woolf inverse variance and Mantel-Haenszel). Both of these standard statistical methods yielded the same point estimate as the Miller group, but found that the result was not statistically significant (RR: 1.04, 95% CI: 0.99-1.10). Secondly, we are concerned that these results are heterogeneous. Both of the standard statistical methods we used showed that the results might still contain heterogeneity. That is, there very well could be something about these trials that makes them too different to combine. The authors of the study should either search for more explanations of these differences or use another statistical technique to combine the results that takes these differences into account. We have asked for more information from the report authors via a letter to the editor of Annals of Internal Medicine.
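The Mantel-Haenszel estimator Dr. DeZee mentions can be sketched in a few lines. It pools 2x2 tables directly, weighting each trial by its size, with no logarithms required. The counts below are hypothetical, not the Miller data.

```python
def mantel_haenszel_rr(trials):
    """Mantel-Haenszel pooled risk ratio across several 2x2 tables.

    Each trial is (deaths_treated, n_treated, deaths_control, n_control).
    Each trial contributes to numerator and denominator in proportion
    to its size, so large trials naturally dominate the estimate.
    """
    num = den = 0.0
    for dt, nt, dc, nc in trials:
        total = nt + nc
        num += dt * nc / total   # treated-arm deaths, size-weighted
        den += dc * nt / total   # control-arm deaths, size-weighted
    return num / den

# Hypothetical trial counts
trials = [(12, 500, 10, 500), (30, 1000, 28, 1000), (8, 400, 9, 400)]
print(f"pooled RR = {mantel_haenszel_rr(trials):.2f}")
```

A pooled risk ratio near 1.0 with a confidence interval that crosses 1.0, as in the Walter Reed reanalysis, means the data cannot distinguish harm from no effect.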

Passwater and Passwater: We’ll refer our readers to your 35-page syllabus for your Society of General Internal Medicine course at www.sgim.org/Handouts/am04/Precourses/PA04.pdf for explanations of these procedures. Dr. DeZee, do you and your colleagues have other “statistical” concerns?

DeZee: We also found a suggestion of publication bias among the 11 high-dose trials with Begg’s test (p=0.073), which we confirmed with the trim-and-fill method. The Miller group suggests that studies showing benefit from vitamin E are unlikely to be missing from the literature. However, small studies that demonstrate no effect could well be unpublished. It is also possible that the authors missed these small trials because they did not search EMBASE, which may have excluded European trials. As Dr. Blumberg noted, the meta-study also excluded trials with fewer than 10 deaths (which seems arbitrary); this, in particular, would tend to bias the results toward a finding of harm. This is a serious bias.
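The intuition behind Begg’s test can be shown with a simplified sketch: it rank-correlates trial effect sizes with their variances. (The published procedure first standardizes each effect against the pooled estimate; that step is omitted here for brevity.) A strong positive correlation means small, imprecise trials report systematically larger effects, which is the funnel-plot signature of possible publication bias. The effect sizes and variances below are invented for illustration.

```python
def kendall_tau(xs, ys):
    """Kendall's rank correlation between two sequences (no tie handling)."""
    n = len(xs)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (xs[i] - xs[j]) * (ys[i] - ys[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Hypothetical (log risk ratio, variance) pairs for five trials.
# Here the least precise (highest-variance) trials also show the
# largest effects, so tau is strongly positive: a bias warning sign.
effects = [0.25, 0.18, 0.12, 0.05, 0.02]
variances = [0.20, 0.15, 0.08, 0.03, 0.01]
print(f"Kendall tau = {kendall_tau(effects, variances):.2f}")
```

In a literature with no publication bias, effect size should be unrelated to trial precision and tau should hover near zero.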

The Miller group searched for the influence of each trial and determined that “none seemed to be driving the results”. In our reanalysis, exclusion of the largest trial (MRC/BHF HPS) resulted in a wider confidence interval (95% CI: 0.96-1.13). We would like to know if the results became non-significant when the Miller group excluded this or other trials.

Also, the Miller group did not account for study quality as a possible explanation of the results. This has been previously shown to affect the results of randomized controlled trials and is recommended by the QUOROM statement.

Passwater and Passwater: Dr. Traber, how statistically significant are the numbers?

Traber: Their statistical claims are not very strong. In general, a good meta-analysis has p-values less than 0.01; none of the p-values cited in the paper were that strong.

Passwater and Passwater: Now let’s return to the point raised by both Dr. Blumberg and Dr. DeZee: the meta-analysis included only studies that reported at least 10 deaths during or after the time of the study. The Miller group stated that of the 36 trials identified for consideration, 12 were excluded because they reported fewer than 10 deaths. This seems to be a very large bias. So, if vitamin E was protective and prevented deaths, that study wasn’t included. Dr. Traber, do you also feel that excluding these trials could affect the results?

Traber: Yes, what do these excluded studies tell us? The Miller group appears to be testing only the question of whether vitamin E increases mortality. A better question to test is “Is there any effect of vitamin E on mortality?” They didn’t test that question because they excluded studies in which fewer than 10 people died. But, if vitamin E does not cause increased mortality, then these are important studies and should not be overlooked.

DeZee: This appears to be an arbitrary decision by the Miller group. This could introduce two possible biases. First, they may have excluded trials that showed vitamin E reduced death; second, they have included only trials that have at least 10 deaths. Either or both of these decisions may bias the results toward harm.

Another serious concern is that the Miller group did not specifically search the European research (the search engines they used would get some, but not all, European trials). Many trials of supplements appear in the German literature.

Blumberg: Miller et al. also do not place their results in the context of benefit and risk, only risk. They do not mention the benefits found in several of the studies they included in their own meta-analysis (e.g., reduction in risk of Alzheimer’s disease, some forms of cancer, AMD, and CVD). How can one assess risk if benefit is not included in the equation?

Passwater and Passwater: The study looked at different types of research studies that used different protocols and procedures such as different doses of vitamin E taken for different lengths of time. Doesn’t this make it a matter of comparing apples to oranges? What critical confounding factors are not compensated for in this meta-study?

Traber: The idea of a meta-analysis is that each study included is properly balanced and the various confounders cancel each other out. Again, studies with larger populations of sick, elderly, frail subjects had greater mortality than those with younger, stronger subjects. Importantly, the largest studies, taken individually, showed no increased risk in all-cause mortality. The question then is “Is the meta-analysis a statistical manipulation of the data, or is it a real harbinger of serious problems?” And here I think it is important to look at in vitro data, animal studies, human observational studies and intervention trials—the totality of evidence—when seeking to determine if vitamin E does harm.

Passwater and Passwater: Vitamin E was often used in combination with pharmaceutical drugs being studied; the effects of these combinations were not discussed in the research.

Blumberg: That’s another interesting question that suggests further research. Also, in the 11 studies in which they suggested harm from high-dose vitamin E supplementation, please note that in five of these, vitamin E was used along with other nutrients, yet they attribute ACM outcomes to vitamin E alone. This is fuzzy thinking and not valid. They also combine results from studies with synthetic and natural-source vitamin E as though the two forms of vitamin E are equivalent.

Passwater and Passwater: There are other questions about the meta-analysis that concern us, but you have already given us and our readers enough scientific evaluation that they can now look more closely at what the meta-analysis shows, no longer forced to rely solely on what the media reported.

Readers may keep informed on this issue by periodically visiting the following websites: Dietary Supplement Information Bureau (www.vitaminEfacts.org), National Nutritional Foods Association (www.nnfa.org/vitamine.htm), and Council for Responsible Nutrition (www.crnusa.org/vitaminEmetanalysis.html).

Thank you, Drs. Blumberg, DeZee and Traber, for sharing your knowledge and time with us. WF

About the Authors

Richard A. Passwater, Ph.D., who has offices in Berlin, MD, has authored several books on nutrition, including The New Supernutrition, Cancer Prevention and Nutritional Therapies, Selenium Against Cancer and AIDS, and Selenium as Food and Medicine. For more information on the topic covered in this column, and/or other news and developments in the field of nutrition, visit the columnist at his website—www.drpasswater.com.

Richard Passwater, Jr. is president of RP Research & Communication, a consulting firm located in Ocean Pines, MD.

© 2005 Whole Foods Magazine and Richard A. Passwater, Ph.D.

This article is copyrighted and may not be reproduced in any form (including electronic) without the written permission of the copyright owners.