Vitamins: Should You Take Them?

Do vitamins make you healthier? Two recent studies disagree. One says vitamins help prevent cancer; another, released Monday, finds they do not prevent heart attacks or strokes in men. We discuss whether vitamins are crucial to good health, and whether supplement-happy Americans are going too far.
Duffy MacKay, N.D., vice president for scientific and regulatory affairs at the Council for Responsible Nutrition
J. Michael Gaziano, chief of the Division of Aging, Brigham and Women's Hospital, and physician at the Veterans Administration Healthcare System
Jose Luis Mosquera, medical consultant and integrative medicine expert