Meeting 5 challenges in creating targeted molecular therapies for cancer
If there were doubts about the promise of targeted molecular therapies for cancer, they've been put to rest by success stories such as epidermal growth factor receptor inhibitors in lung cancer and RAF inhibitors in melanoma, says William Sellers, MD, vice president of oncology research at Novartis Institutes for Biomedical Research.
Sellers, who was appointed to the National Cancer Advisory Board this year, outlines 5 significant challenges in exploiting genomic data to develop novel therapies: acquiring sufficient genetic data, finding new kinds of druggable targets, overcoming drug resistance, effectively combining treatments, and improving preclinical translational systems. He gave Cancer Discovery's Eric Bender some perspectives.
What's the challenge now in acquiring genomic data?
We need to have a complete picture of the genetics of all cancers. The genomics community will largely solve this problem within the next 5 years, though I think that sequencing needs to be done at a depth that people still haven't fully appreciated. At this point, we're mostly describing single-gene mutation frequencies. The second stage is to understand what I would call the genetic interaction map. People may not realize how many samples will need to be sequenced in order to really understand the relationships between mutational events.
How will we expand the range of druggable targets?
I roughly divide that problem into 2 parts. One is the difficulty of drugging oncogenes in the so-called undruggable class, like MYC and RAS and others that don't have a kinase catalytic domain. There needs to be significant innovation in what we consider a druggable target. It would be nice if some efforts in academia focused here, rather than on kinases or mechanisms that industry is already investigating thoroughly.
The second part of that problem is taking on tumor suppressor genes. Of course, from a drug discovery point of view, replacing function is much harder than turning off function. One example where deletion of a tumor suppressor has predicted sensitivity to a drug is the pediatric illness tuberous sclerosis, which is caused by germline mutations in the tumor suppressor genes TSC1 and TSC2. The Novartis drug Afinitor has had a very significant beneficial effect; essentially, loss of these genes deregulates the target of the drug.
We need to exploit that paradigm more fully. Generally our approach, along with many other groups, is to try to find druggable pathway nodes that are linked up to tumor suppressor genetics through the concept of synthetic lethality.
How will we address resistance to targeted therapies?
There's no question that cancers will develop resistance to single molecular therapies. The question is whether this is an inevitably hopeless situation, in which the diversity of resistance mechanisms is so profound and complex that one could never make any progress.
What's intriguing is that if you look at the situations that we know the best so far, in the targeted therapeutic area, in most cases the mechanism of resistance reactivates the original pathway that was activated by mutations in the first place. In the case of BCR-ABL, the mutations that block the binding of Gleevec to ABL are the most prevalent form of therapeutic resistance. I think this is quite amazing. The second-generation effective therapeutics in that disease are all attacking BCR-ABL again.
Unlike with chemotherapy resistance, in the targeted area you can define genetic model systems in which the mechanisms of resistance are often the same as those you see in humans. That gives us great hope that we can study these mechanisms very early in the clinical development process. For example, before our Smoothened inhibitor (for Ptch mutations in pediatric medulloblastoma and adult basal cell carcinoma) even went into the clinic, we had defined mechanisms of resistance and started to understand drug combinations that might prevent resistance.
What are the challenges in such combination therapies?
One is that we obviously need to test combinations much earlier in the development process. In the preclinical arena we are trying some high-throughput combination screens, but even those are difficult given the scale needed for an unbiased, systematic approach. Right now we are driven mainly by hypothesis-based combinations. Hopefully that will still lead us away from an older model of simply adding a new drug to the standard of care, which is not very satisfying, especially when the standard of care is not particularly effective.
You're also calling for upgrades in preclinical translational infrastructure. What do those entail?
We need a preclinical translational infrastructure with model systems that reflect what will happen in a human.
In our world we break that down into 3 areas. Our cell line encyclopedia project is a collaboration with the Broad Institute to build a library of 1,000 cell lines completely characterized at the genetic and RNA levels. Second, internally we're building a primary human tumor bank, with human tumors implanted directly in mice and never grown on culture dishes, to avoid some of the downsides of culture on plastic surfaces. Third, we're initiating what we call a mouse allograft tumor program, with tumors from genetically engineered mice that are serially transplanted as well.
All of these approaches need to be brought to bear, and the tumors need to be characterized in greater detail to understand how well they reflect human cancers.