How the Pharmaceutical Industry Is Adopting Artificial Intelligence To Boost Drug Research

by Andrii Buvailo, PhD


What is “artificial intelligence”?

Artificial intelligence (AI) is an interdisciplinary science concerned with building “intelligent agents”; the field traces its origins to the 1950s. Intelligent agents are autonomous systems or programs that mimic human intellect: they can “observe” an environment using sensors (or receive data inputs) and act toward specific goals using actuators.

An important component of AI is machine learning (ML), which gives systems the ability to learn autonomously from data and improve their outcomes over successive iterations without being explicitly programmed (unlike rule-based “if-then” computer programs).
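
To make the contrast concrete, here is a minimal sketch in Python (using scikit-learn; the toy molecular-weight “solubility” data below is invented purely for illustration) of a hand-written rule next to a model that learns a similar decision boundary from labeled examples:

    from sklearn.linear_model import LogisticRegression

    # "If-then" programming: a human writes the decision rule explicitly.
    def rule_based_classifier(molecular_weight):
        return "soluble" if molecular_weight < 500 else "insoluble"

    # Machine learning: the decision boundary is inferred from labeled examples.
    # (Toy data, invented purely for illustration.)
    X = [[120.0], [310.5], [480.2], [523.7], [610.3], [702.9]]  # molecular weights
    y = ["soluble", "soluble", "soluble", "insoluble", "insoluble", "insoluble"]

    model = LogisticRegression().fit(X, y)
    print(model.predict([[450.0], [650.0]]))  # e.g. ['soluble' 'insoluble']

The rule in the first function is fixed by its author, while the fitted model would shift its boundary if retrained on different examples; that retrainability is what the “learning” in machine learning refers to.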

A notable family of machine learning models is neural networks, particularly deep neural networks (DNNs), which gave rise to the widely marketed notion of deep learning (DL). Such networks loosely resemble the layered structure of the human brain and are therefore believed to be the closest modeling framework to human-type intelligence. However, deep neural nets also have fundamental and technological limitations and require large amounts of training data, in contrast to more traditional statistical models (random forests, k-nearest neighbors, support vector machines, etc.). The diversity of existing AI models and strategies makes AI adoption notoriously complex: it requires a deep, multifaceted understanding of both the specific tasks to be solved with AI and the properties of the various AI models and algorithms.
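
As a rough illustration of that diversity, the following sketch (Python with scikit-learn; the synthetic dataset is a stand-in, not real assay data) fits two of the traditional model families mentioned above to the same small dataset, where they remain competitive precisely because the data volume is modest:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    # Synthetic stand-in for a small classification dataset.
    X, y = make_classification(n_samples=300, n_features=20, random_state=0)

    # Traditional statistical models often perform well on modest data volumes;
    # which family wins depends on the structure of the specific task.
    for model in (RandomForestClassifier(random_state=0), SVC()):
        score = cross_val_score(model, X, y, cv=5).mean()
        print(type(model).__name__, round(score, 3))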

While AI is a very broad field embracing numerous modeling paradigms, it is deep neural nets that have been primarily responsible for the majority of the latest and most publicized breakthroughs in image and video processing, natural language processing, gaming, various pattern-recognition applications, and drug design.

For example, convolutional neural networks (CNNs) proved surprisingly effective at image-processing tasks, while recurrent neural networks with long short-term memory (LSTM) units drove earlier progress in sequence learning and sequence-to-sequence translation (seq2seq), which in turn enabled impressive results in speech-to-text comprehension and the rise of Siri, Cortana, Google Assistant, Alexa, and others.
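
As a sketch of what such a network looks like in code (PyTorch here; all layer sizes are arbitrary illustrative choices, not a published architecture), a minimal CNN can be defined and run on a random image-shaped tensor in a few lines:

    import torch
    import torch.nn as nn

    # A tiny CNN: two convolution + pooling stages, then a linear classifier.
    class TinyCNN(nn.Module):
        def __init__(self, n_classes=10):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(32 * 8 * 8, n_classes)

        def forward(self, x):
            x = self.features(x)  # (batch, 32, 8, 8) for 32x32 inputs
            return self.classifier(x.flatten(1))

    # Forward pass on a random batch of four 32x32 RGB "images".
    logits = TinyCNN()(torch.randn(4, 3, 32, 32))
    print(logits.shape)  # torch.Size([4, 10])

The convolutional layers learn small filters that slide across the image, which is what makes this family of models so well suited to visual pattern recognition.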

Recently, pharmaceutical and biotech companies and academic institutions have been demonstrating a vivid interest in AI applications for various R&D and operational needs in drug discovery, clinical trials, translational science, biomedical research, and pharmacovigilance. This surge of interest is inspired and driven, in part, by profound advances in neural net architectures (in 2012, AlexNet won the ImageNet competition; in 2014, the Generative Adversarial Network (GAN) architecture was introduced), and by illustrative, widely publicized practical achievements of AI in various fields and industries, including:

  • learning to play complex games (such as chess and Go) at and above human level,
  • recognizing speech and text, and synthesizing language,
  • revealing complex behavioral patterns (anti-malware systems, YouTube recommendation algorithms),
  • understanding and categorizing objects in images and videos (Facebook face recognition, video surveillance),
  • creating ultra-realistic images and videos (e.g. “deep fakes”),
  • powering self-driving cars (e.g. Tesla),
  • powering robots to perform complex tasks (e.g. Boston Dynamics robots),
  • military applications; and the list goes on.

Another driver is the substantial progress made over the past 10-20 years in auxiliary “AI-enabling” technologies, such as computing power, data storage, specialized hardware chips for ML workloads, and public cloud infrastructure and cloud-based services.
