Machine learning was first used in drug discovery and molecular informatics by the pharma industry approximately two decades ago, as a way to increase efficiency. Deep learning (DL), which uses artificial neural networks with multiple hidden layers, offers a more flexible architecture, allowing structures to be tailor-made for a specific problem. Applications of this advanced artificial intelligence (AI) capability have increased in the pharma industry over the past decade, and the technology has demonstrated promise in tackling many problems in the drug discovery process. In pharma research, for example, it can be applied to use cases such as predicting interactions between candidate compounds and target molecules, synthesis prediction, the generation of new chemical structures and biological image analysis.
In recent years, pharma and biotech companies have implemented strategies to take advantage of the technology for their pipelines. At HealthXL, we have seen, and discussed previously, that AI research can in some instances be overhyped; however, in this highly competitive industry, turning a blind eye to novel tools that can speed up the research process could hurt a pharma company, causing it to lose a competitive advantage. This is why some of the largest pharmaceutical companies are turning to AI to support their research.
How are industry giants like Pfizer, Merck and AstraZeneca using deep learning to develop drugs? There have been several high-profile deals, partnerships and collaborations in the past few years. Here we’ll explore some varied initiatives and applications that are indicative of the shift in the sector.
Merck Releases Data for Crowdsourced Deep Learning
The 2012 Merck Molecular Activity Kaggle Challenge offered $40,000 to Kaggle’s data science community to outperform the drug discovery techniques then standard in pharma.
The challenge involved identifying the best statistical techniques for predicting biological activities of different molecules, both on- and off-target, given numerical descriptors generated from their chemical structures.
Participants were given 15 data sets for biologically relevant targets, each with chemical structure information for thousands of individual molecules. The goal of the competition was to predict the activity levels between molecules and targets to find a candidate molecule for further development that would be active toward its intended target and inactive toward targets that might cause side effects.
In all, 236 teams faced off over 60 days. The winning team used deep learning algorithms running on GPUs to distill the numerous large, multidimensional Merck datasets into reduced-dimensionality representations that clustered molecules with similar activity. The result represented a 17% improvement over the industry standard, highlighting exciting new avenues for analytics and machine learning in pharma research.
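The core idea, a neural network that learns to map a vector of numerical molecular descriptors to an activity value, can be caricatured in a few lines. The winning entry used much deeper, multi-task networks on GPUs; the sketch below is a minimal single-task version, and the dataset, layer sizes and hyperparameters are invented for illustration, not Merck’s.

```python
import numpy as np

# Toy stand-in for the Kaggle setup: each row is a vector of numerical
# descriptors for one molecule; the target is a measured activity.
# All data here is synthetic; real descriptor sets have thousands of columns.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))            # 200 molecules, 16 descriptors
true_w = rng.normal(size=(16, 1))
y = np.tanh(X @ true_w) + 0.1 * rng.normal(size=(200, 1))

# One hidden layer; "deep" models stack several of these.
W1 = rng.normal(scale=0.1, size=(16, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.1, size=(32, 1));  b2 = np.zeros(1)

lr = 0.01
losses = []
for _ in range(300):
    h = np.maximum(0, X @ W1 + b1)        # ReLU hidden layer
    pred = h @ W2 + b2
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagate the mean-squared-error gradient.
    dW2 = h.T @ err / len(X); db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (h > 0)
    dW1 = X.T @ dh / len(X); db1 = dh.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(losses[0] > losses[-1])   # training error falls as the net fits
```

In the competition setting, one network predicted activities for many targets at once (multi-task learning), which is part of why deep nets beat single-target baselines.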
Pfizer Collaborates with IBM Watson
In a significant collaboration with IBM Watson Health, Pfizer announced that it would use Watson for Drug Discovery to accelerate advances in its immuno-oncology research. Using Watson, Pfizer’s team can quickly analyse massive amounts of data from disparate sources to test hypotheses, supporting the identification of new drug targets and surfacing connections that could lead to combination medicines for immuno-oncology.
“When researchers can more quickly uncover novel patterns and connections, they can accelerate discovery,” says Scott Spangler, PhD, Chief Data Scientist for Life Sciences at IBM Watson Health. “We believe this can lead to effective pharmaceuticals going to market and reaching patients sooner. Today, Watson for Drug Discovery is helping life sciences researchers to understand what is currently known with speed and scale, so that they can generate evidence-based hypotheses with greater confidence.”
Pharma Partnering with Startups to Fuel Innovation
Last year, AstraZeneca inked a research collaboration with Berg Health, a company using DL to screen biomarkers from patient data. The partnership focuses on finding and evaluating novel ways of treating Parkinson’s disease and other neurological disorders. Under the terms of the partnership, AstraZeneca will have the right to secure an exclusive license to any of the drug candidates coming out of the work.
The financial terms of the Berg Health/AstraZeneca deal were not disclosed; however, investments by Big Pharma in AI and DL for drug discovery have surged in the past few years. Last year, Sanofi and Exscientia (which discovers compounds via Bayesian models of ligand activity built from drug discovery data, a flexible form of machine learning that can combine scientists’ intuitions with features gleaned from data) signed a collaboration and license option deal worth up to €250 million to discover bispecific small-molecule drugs against metabolic diseases. Exscientia also signed a similar deal with GSK in July: under the terms of the agreement, GSK will use Exscientia to discover novel and selective small molecules for up to ten disease-related targets, a deal worth up to £33 million if all milestones are achieved. These are just some highlights of the many deals that pharma companies have recently made with DL companies globally to improve the discovery of drugs and biomarkers.
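To make “Bayesian models of ligand activity” concrete: a classic, much-simplified version is a naive Bayes classifier over binary molecular fingerprint bits, where a prior over each bit’s probability (here a uniform Beta prior, i.e. Laplace smoothing) encodes assumptions before the data speaks. Exscientia’s actual models are proprietary and far richer; everything below (fingerprints, labels, bit count) is invented for illustration.

```python
import math

# Invented 4-bit fingerprints for molecules labelled active/inactive.
actives   = [[1, 1, 0, 1], [1, 0, 0, 1], [1, 1, 1, 1]]
inactives = [[0, 0, 1, 0], [0, 1, 1, 0], [1, 0, 1, 0]]

def bit_probs(mols):
    # Posterior mean of P(bit = 1 | class) under a Beta(1, 1) prior,
    # which is just Laplace (add-one) smoothing of the bit counts.
    n = len(mols)
    return [(sum(m[i] for m in mols) + 1) / (n + 2) for i in range(4)]

p_act, p_inact = bit_probs(actives), bit_probs(inactives)

def log_odds_active(fp):
    # Log posterior odds of activity for a new fingerprint,
    # assuming equal class priors and independent bits (the naive part).
    s = 0.0
    for bit, pa, pi in zip(fp, p_act, p_inact):
        s += math.log(pa if bit else 1 - pa)
        s -= math.log(pi if bit else 1 - pi)
    return s

print(log_odds_active([1, 1, 0, 1]) > 0)   # True: resembles the actives
print(log_odds_active([0, 0, 1, 0]) < 0)   # True: resembles the inactives
```

The appeal of the Bayesian framing is exactly what the article describes: priors let chemists inject intuition (e.g. which substructures matter), while the likelihood terms are learned from screening data.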
DL is showing huge promise, but pharma has been down this road before with QSAR and other data-led approaches. Those historic methods of computer-aided drug design depended heavily on manually crafted systems for describing molecules, which were time-consuming to build and limited in general applicability. Today, DL methods can learn their own bespoke representations of molecules from the massive amounts of data supplied to them, uncovering patterns that are invisible to manual descriptor systems. The surge of activity in this space in 2017 marks a paradigm shift and can be seen as a sign of pharma’s growing interest in large-scale computation for drug discovery. It also reflects that the industry’s initial skepticism about AI is giving way to genuine interest, driven by the technology’s promise to address the industry’s most pressing problem: the drug failure rate.
So what will the future of DL look like as it relates to research in the pharma industry? Companies like Netflix and Amazon adapted their entire business models and cultures to take advantage of AI, ultimately leading to massive success. While the technology is still being tested in pharma, companies will likely need to adapt their research processes significantly to implement it well and reap its full potential benefits. As we know, technological change in this highly regulated industry is nowhere near straightforward. It will require highly skilled and knowledgeable people at the helm, as well as the right external champions, to lead the revolution to impactful outcomes.