
Accelerating the Development and Use of Trustworthy Generative AI for Science and Engineering


Chandan Damannagari, Director, AI Software

Robert Mueller-Albrecht, Product Marketing Engineer

Intel Corporation

The Trillion Parameter Consortium

The Trillion Parameter Consortium (TPC) is a collaboration between academic and research institute partners, industry, and vendors, along with the labs and HPC centers that host some of the world’s most powerful supercomputers. It is a broad “AI for Science” community that works together to create models for science and engineering use cases at the scale of the largest private models, collaborating on datasets and model training while sharing compute cycles.

Argonne National Laboratory and Intel have been working together on a Large Language Model (LLM)-based Generative Pretrained Transformer (GPT) solution dubbed Aurora GPT, built specifically to assist with scientific problems involving extremely high and variable parameter counts, running on the Intel® Xeon® Scalable processor and Intel® Data Center GPU Max Series based Aurora exascale supercomputer.

The intent is to “combine all the text, codes, specific scientific results, papers, into the model that science can use to speed up research,” as Ogi Brkic, Vice President and General Manager, Super Computing Product Line at Intel, put it. This collaboration soon expanded and, expressing the spirit of collaboration across the science, artificial intelligence, and high-performance computing communities, led to the founding of the TPC on November 10, 2023.

Join the European Kick-Off Workshop of the Trillion Parameter Consortium (TPC) 

Later this month, scientists from around the world will join a workshop discussing how to leverage the predictive power of Large Language Models (LLMs) across scientific domains. For three days they will work on:

  1. AI methods, natural language processing, multimodal approaches, and architectures
  2. hardware and software systems
  3. uses and detailed implementations taking advantage of the resulting AI systems across science, engineering, medicine, and other domains

 

TPC European Kick-Off Workshop 

 

It will be a forum for focused, in-depth discussions driving toward a shared vision and goal, particularly aimed at accelerating the use of generative AI in science and engineering.

The workshop program includes: 

  1. An optional half-day tutorial on LLMs for Science and Engineering Applications, led by instructors from the University of Chicago and Argonne National Laboratory (ANL)
  2. Plenary sessions with invited talks covering:
    • Vision for Large-scale AI for Science
    • Major National and International Funding Initiatives at the Nexus of AI and HPC
    • Insights from Building Large-scale AI Models for Science
    • Trustworthy and Responsible AI

At the heart of the workshop are parallel breakout sessions with leading scientists in their fields, designed to explore and pursue collaborations, covering topics ranging from model architecture and performance, to skills, safety, and trust evaluation, to bioinformatics and treatment design.

More on the Trillion Parameter Consortium 

 

Through the TPC, the minds of the global scientific community come together with a single purpose:   

To address the challenges of building large-scale artificial intelligence (AI) systems and advancing trustworthy and reliable AI for scientific discovery.  

The result is a thriving community of over 70 organizations and 500 participants in a common Slack workspace, engaged in creating large-scale generative AI models to address key challenges in advancing AI for science.

This includes, but is not limited to: 

  • Developing scalable model architectures and training strategies 
  • Organizing and curating scientific data for training models 
  • Optimizing AI libraries for current and future exascale computing platforms 
  • Developing deep evaluation platforms to assess progress on scientific task learning, reliability, and trust


 

Today, Intel is excited to be an active contributor to the Trillion Parameter Consortium and its spirit of embracing collaboration, open standards, and AI solution scalability across multiarchitecture platforms including CPUs, GPUs, NPUs, and specialized AI accelerators.
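To make the multiarchitecture point concrete, here is a minimal sketch of device-agnostic model code. It assumes a standard PyTorch installation with the Intel® Extension for PyTorch available; when an Intel GPU ("xpu" device) is present the model runs there, otherwise it falls back to the CPU. This is an illustrative example only, not code from the TPC or the Aurora GPT project.

```python
# Illustrative sketch (assumes PyTorch and Intel Extension for PyTorch are installed).
import torch

try:
    import intel_extension_for_pytorch as ipex  # registers the "xpu" device for Intel GPUs
except ImportError:
    ipex = None

# Pick the best available device: Intel GPU ("xpu") if present, otherwise CPU.
device = torch.device("xpu") if ipex is not None and torch.xpu.is_available() else torch.device("cpu")

# A small stand-in model; the same pattern applies to much larger networks.
model = torch.nn.Sequential(
    torch.nn.Linear(1024, 1024),
    torch.nn.GELU(),
    torch.nn.Linear(1024, 256),
).to(device).eval()

if ipex is not None:
    model = ipex.optimize(model)  # apply Intel-specific operator and memory-layout optimizations

with torch.no_grad():
    x = torch.randn(8, 1024, device=device)
    y = model(x)

print(f"Ran on {device}: output shape {tuple(y.shape)}")
```

Writing against an abstract device handle in this way is what lets the same model code move between a developer workstation CPU and accelerator-dense HPC nodes without source changes.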

We encourage you to learn more, contribute to defining the role of AI in science and engineering, and be at the forefront of the discussions that help define the next generation of reliable, trustworthy, and powerful research powered by AI.

Join the Trillion Parameter Consortium and become an active contributor on the TPC Slack Workspace.

Additional Resources 

About the Author
Rob enables developers to streamline programming efforts across multiarchitecture compute devices for high-performance applications, taking advantage of Intel's family of development tools. He has more than 20 years of experience in technical consulting, software architecture, and platform engineering, working in IoT, edge, and embedded software and hardware developer enablement.