Every year, many people become ill or even die because of natural supplements. If these products are "natural," surely they must be safe; how could they do harm? Well, natural substances like vitamins, minerals, and herbs all evoke change in the body. Anytime ANYTHING is ingested (food, drinks, supplements, drugs, etc.), it affects the body's physiology. So why do so many people get harmed by natural supplements? Mainly for two reasons.
First, people often don't report to their medical doctor that they are taking vitamin D, coenzyme Q10, or zinc capsules. These are just "supplements," so why mention them when the doctor prescribes an actual drug? People don't report natural supplements to their doctors because they think the supplements are trivial, or because they are scared their MD will scold them for doing something "natural" for their health. Are all MDs against natural products? I certainly think not. I hope that patients and MDs become more open about natural products, because these products can interact with prescription drugs.
The second problem is that most natural supplements are sold over the counter. At any drug store, health food store, or health food section of a major grocery store, you can purchase supplements without a prescription or any guidance. And because supplements are so easy to purchase, people tend to think that they are harmless and can be taken like any other food-like product. I wish that supplements were more regulated, and I wish even more that non-prescription drugs (like pain relievers, anti-allergy medications, antihistamines, etc.) were more regulated. Overall, there would be fewer side effects from combining prescription drugs with non-prescription medications and supplements.
Furthermore, I believe that everyone should have nutritional counseling. Eating well is not that complicated; it's just never taught, so the general public really doesn't know what's healthy and what's not. The only thing we learn about eating is what's passed down from our parents. Combine that with all the mixed messages from the media (advertisements and commercials making health claims, and T.V. shows like Dr. Oz or Oprah, each preaching various health trends), and nutrition gets even more confusing. Unless you do your own research on health by reading books or websites, or seek out professional help from a dietician or naturopathic doctor, you'll never know what is healthy and you'll just keep eating what tastes good. Why-oh-why is the subject of "Nutrition" not taught in high schools?!
Thanks for reading my little rant. I'll try my best to not be so preachy or complain-y tomorrow. :P