by Craig Klugman, Ph.D.
Ebola burst onto the scene in 1976, when a thirty-year-old man arrived at the Yambuku Mission Hospital in Zaire complaining of severe diarrhea. He left the hospital two days later and was never seen again. In the days and weeks that followed, many of the people who had been patients or care providers at the facility during his stay died after experiencing dehydration, fever, vomiting, diarrhea, and hemorrhaging. The death rate was staggering: more than 80% of those infected did not recover.
Since then, the CDC reports, there have been 34 distinct outbreaks of the five strains of Ebola. Two strains do not infect humans. Epidemics have caused human deaths in Zaire, Sudan, Ivory Coast, Gabon, Congo, and Uganda. The latest outbreak began in March 2014 and has affected large swaths of Guinea, Liberia, and Sierra Leone, with more limited transmission in Nigeria, Madrid, and Dallas. Senegal has also reported a travel-associated case. The CDC estimates that there had been 8,400 cases of Ebola as of October 10 (4,656 confirmed by lab tests) and 4,033 deaths.
Given that the world has known about the disease for nearly 40 years, why is it only now attracting such widespread attention and panic? The reason is that the disease is finally being seen as a risk to the developed world. When it was confined to a few hundred cases in remote regions among poor people, the West as a whole did not worry. The developed world did not devote adequate funding to finding the natural host (believed to be fruit bats, though not definitively confirmed), to developing treatments, or to creating a vaccine.
The views, opinions, and positions expressed by these authors and blogs are theirs and do not necessarily represent those of the Bioethics Research Library and Kennedy Institute of Ethics or Georgetown University.