
Increasing Performance of AI Applications on Microsoft Azure with Intel® Cloud Optimization Modules

Sonya_Wach
Employee

Developing and deploying compute-intensive artificial intelligence (AI) applications on the cloud offers cost savings and ease of scalability. Microsoft Azure*, a popular cloud computing platform, provides a wide variety of cloud services, including specialized platforms and applications for AI development. Azure AI offers a portfolio of AI tools, APIs, and models to help modernize and optimize AI and machine learning projects.

Model efficiency can be improved further during development by implementing pre-built optimizations and tools for a variety of applications on cloud platforms like Microsoft Azure. However, finding and implementing these tools and optimizations is often time-consuming and resource-intensive for developers. With access to extensive guides and documentation in a localized, open-source environment, developers can avoid the pain of adding new architectures to their code and easily increase their models' performance.

What are Intel® Cloud Optimization Modules?

The Intel® Cloud Optimization Modules are open-source codebases with codified Intel AI software optimizations, built with production AI developers in mind. The modules offer a suite of cloud-native reference architectures that integrate seamlessly with AI workloads to further maximize the potential of cloud-based solutions. Developers can implement these solutions to increase the efficiency of their workloads and achieve optimal performance on Intel CPU and GPU technologies.

Cloud optimization modules are available for popular cloud platforms like Microsoft Azure. These modules include purpose-built tools that complement and enhance the cloud experience on Azure with relevant Intel AI software optimizations, and they facilitate creating and deploying highly available, scalable AI applications on Microsoft Azure. The optimizations power AI solutions with end-to-end software for a variety of use cases, from computer vision to natural language processing and more.

Each module’s content package includes an open-source GitHub* repository with all the relevant documentation: a whitepaper with more information on the module and what it covers, a cheat sheet that highlights the most relevant code, and a video series with hands-on walkthroughs on how to implement the architectures. There’s also an option to attend office hours for any specific implementation questions you may have.

Intel Cloud Optimization Modules for Azure

Intel Cloud Optimization Modules are available for Microsoft Azure, including optimizations for Kubernetes and Kubeflow pipelines. You can learn more about these optimization modules available for Azure below:

XGBoost Pipeline on Kubernetes

Azure Kubernetes Service* (AKS) is a popular Azure tool that simplifies the deployment, scaling, and management of containerized applications. AKS offloads operational overhead to Azure and provides node management, security monitoring, and other features that improve application functionality. This module demonstrates how to build scalable machine learning pipelines on Kubernetes, including configuring Microsoft Azure Cloud Services* and an AKS cluster with a confidential computing node pool to implement a full end-to-end machine learning pipeline. The module uses Intel® Optimization for XGBoost*, the Intel® Extension for Scikit-learn*, and the oneAPI Data Analytics Library (oneDAL) to optimize and accelerate data processing, model training, and inference in a full end-to-end reference architecture. Through this module, you can also learn how to incrementally retrain the XGBoost model as new data becomes available.
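As a rough illustration of the kind of optimizations this module codifies (not the module’s own reference code), the sketch below patches scikit-learn with the Intel Extension for Scikit-learn for accelerated preprocessing and then continues training an XGBoost booster when a new batch of data arrives. The synthetic data, parameters, and file names are illustrative assumptions.

# Minimal sketch (not the module's reference code): the Intel Extension for
# Scikit-learn accelerates preprocessing, and XGBoost's xgb_model argument
# continues training from an existing booster as new data arrives.
from sklearnex import patch_sklearn
patch_sklearn()  # route supported scikit-learn calls to Intel oneDAL

import numpy as np
import xgboost as xgb
from sklearn.preprocessing import StandardScaler

def make_batch(n_rows=10_000, n_features=20, seed=0):
    # Illustrative synthetic batch; a real pipeline would read from Azure storage.
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(n_rows, n_features))
    y = (X[:, 0] + rng.normal(scale=0.5, size=n_rows) > 0).astype(int)
    return X, y

params = {"objective": "binary:logistic", "tree_method": "hist", "eta": 0.1}

# Initial training on the first batch.
X0, y0 = make_batch(seed=0)
scaler = StandardScaler().fit(X0)          # accelerated via the sklearnex patch
dtrain = xgb.DMatrix(scaler.transform(X0), label=y0)
booster = xgb.train(params, dtrain, num_boost_round=50)

# Incremental training: continue from the previous booster when new data lands.
X1, y1 = make_batch(seed=1)
dnew = xgb.DMatrix(scaler.transform(X1), label=y1)
booster = xgb.train(params, dnew, num_boost_round=25, xgb_model=booster)

booster.save_model("xgb_incremental.json")  # artifact an AKS job could publish

Passing the previous booster through xgb_model is what makes the retraining incremental: each run adds trees on top of the existing model instead of starting from scratch.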

XGBoost on Kubeflow Pipeline

Kubeflow, an open-source project that makes machine learning workflows on Kubernetes simple and scalable, promotes a consistent environment across testing, control, and production for high-scale deployments of machine learning models. This module teaches you how to maximize the performance and productivity of XGBoost with a loan default prediction problem on an accelerated Kubeflow pipeline. The module uses Intel Optimization for XGBoost on an Azure confidential computing cluster to implement an end-to-end machine learning pipeline, from data preprocessing to model inference, with oneAPI software optimizations.
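For orientation only, here is a minimal Kubeflow Pipelines (KFP v2 SDK) sketch of a preprocess-then-train flow in the same spirit as the module’s pipeline. The component bodies, base images, package lists, column names, and loan-default data handling are placeholders and assumptions, not the module’s actual code.

# Minimal KFP v2 sketch of a preprocess -> train pipeline.
# Component bodies are placeholders; the module's actual pipeline adds
# Intel-optimized containers and an Azure confidential computing node pool.
from kfp import dsl, compiler

@dsl.component(base_image="python:3.10",
               packages_to_install=["pandas", "scikit-learn-intelex"])
def preprocess(raw_data: str, features: dsl.Output[dsl.Dataset]):
    from sklearnex import patch_sklearn
    patch_sklearn()  # accelerate scikit-learn preprocessing with oneDAL
    import pandas as pd
    df = pd.read_csv(raw_data)  # e.g. a loan-default CSV (placeholder path)
    df.fillna(0).to_csv(features.path, index=False)

@dsl.component(base_image="python:3.10",
               packages_to_install=["pandas", "xgboost"])
def train(features: dsl.Input[dsl.Dataset], model: dsl.Output[dsl.Model]):
    import pandas as pd
    import xgboost as xgb
    df = pd.read_csv(features.path)
    # "default" is an assumed label column name for the loan-default example.
    dtrain = xgb.DMatrix(df.drop(columns=["default"]), label=df["default"])
    booster = xgb.train({"objective": "binary:logistic", "tree_method": "hist"},
                        dtrain, num_boost_round=100)
    booster.save_model(model.path)

@dsl.pipeline(name="loan-default-xgboost")
def loan_default_pipeline(raw_data: str):
    feats = preprocess(raw_data=raw_data)
    train(features=feats.outputs["features"])

if __name__ == "__main__":
    compiler.Compiler().compile(loan_default_pipeline, "loan_default_pipeline.yaml")

Compiling the pipeline produces a YAML specification that can be uploaded to a Kubeflow deployment, such as one running on an AKS cluster.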

Enhance your AI projects on Azure with Intel Cloud Optimization Modules by utilizing Intel optimizations and containers for popular tools. You can learn how to build accelerated models on your favorite Microsoft Azure tools and services and implement powerful software optimizations to take your projects to the next level. Unlock the potential of your Azure projects through these modules, and sign up for office hours if you have any questions about your implementation!

We encourage you to check out Intel’s other AI Tools and Framework optimizations and learn about the unified, open, standards-based oneAPI programming model that forms the foundation of Intel’s AI Software Portfolio. Also, check out the Intel Developer Cloud to try out the latest AI hardware and optimized software to help develop and deploy your next innovative AI projects!

 

About the Author
AI/ML Technical Software Product Marketing Manager at Intel. MBA, Engineer, and previous two-time startup founder with a passion for all things AI and new tech.