19 Companies Pioneering AI Foundation Models in Pharma and Biotech
Foundation models represent a new paradigm in artificial intelligence (AI), revolutionizing how machine learning models are developed and deployed. They are a class of large-scale machine learning models, typically based on deep learning architectures such as transformers, trained on massive datasets encompassing diverse types of data. As these models grow increasingly capable, they become useful for applications across a wide range of economic functions and industries, including biotech. The most prominent examples of general-purpose foundation models are GPT-3 and GPT-4, which form the basis of ChatGPT, and BERT (Bidirectional Encoder Representations from Transformers). These are very large models trained on enormous volumes of data, often in a self-supervised or unsupervised manner, i.e., without the need for labeled data.
Their scalability in terms of both model size and data volume enables them to capture intricate patterns and dependencies within the data. The pre-training phase endows foundation models with a broad knowledge base, making them highly effective in few-shot or zero-shot learning scenarios where minimal labeled data is available for specific tasks.
This gives them high versatility and strong transfer learning capabilities: a pre-trained model can be adapted to the nuances of a particular challenge through additional fine-tuning.
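To make this concrete, here is a minimal sketch of how a pre-trained biology foundation model can be reused for a downstream task. It assumes the open-source Hugging Face `transformers` library and the publicly released ESM-2 protein language model (the checkpoint name and toy sequence below are illustrative choices, not from the original article): the pre-trained model yields sequence embeddings that a small task-specific classifier can consume, so only minimal labeled data is needed for the new task.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Load a pre-trained protein foundation model (trained self-supervised on protein sequences).
model_name = "facebook/esm2_t6_8M_UR50D"  # small ESM-2 checkpoint, chosen for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # toy protein sequence

with torch.no_grad():
    inputs = tokenizer(sequence, return_tensors="pt")
    outputs = model(**inputs)
    # Mean-pool the per-residue representations into a single embedding vector.
    embedding = outputs.last_hidden_state.mean(dim=1)

print(embedding.shape)
# `embedding` can now feed a lightweight downstream classifier (transfer learning),
# instead of training a large model from scratch for every new task.
```

In practice, domain-specific foundation models from the companies listed below follow the same pattern: expensive self-supervised pre-training once, then cheap adaptation to many downstream biology or chemistry tasks.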
Below we summarize a number of companies building domain-specific foundation models for biology research and related areas, such as chemistry.