Thursday, May 12, 2011

The Truth About Vaccines

So, we are all taught that the medical establishment introduced vaccines and thereby saved the world from certain destruction. However, the truth seems to be that these diseases were already almost entirely eradicated before vaccines were ever introduced, and when the diseases finished disappearing after vaccines came on the scene, the medical industry claimed the credit.

Below are some charts from Mothering magazine that illustrate this point: