Cancer’s Big Data Problem
October 20, 2016
(Medical Xpress) – Data is pouring into the hands of cancer researchers, thanks to improvements in imaging, models and understanding of genetics. Today the data from a single patient’s tumor in a clinical trial can add up to one terabyte—the equivalent of 130,000 books. But we don’t yet have the tools to efficiently process this mountain of genetic data and make more precise predictions for therapy. And such tools are needed: treating cancer remains a complex moving target. We can’t yet say precisely how a specific tumor will react to any given drug, and as a patient is treated, cancer cells can continue to evolve, making the initial therapy less effective.
The views, opinions and positions expressed by these authors and blogs are theirs and do not necessarily represent those of the Bioethics Research Library and Kennedy Institute of Ethics or Georgetown University.