
Easily Develop and Deploy AI Applications with New Services on the Intel® Tiber™ Developer Cloud

Sonya_Wach
Employee

The Intel® Tiber™ Developer Cloud is a popular platform among developers and enterprises for developing and deploying AI/ML applications easily and at low cost, as seen with these companies currently using the platform to build out their solutions. Intel Tiber Developer Cloud accelerates AI development by running Intel-optimized software on the latest Intel hardware. Developers can achieve strong performance gains at significantly lower cost than on-premises environments, with ready-to-use optimizations and tools such as oneAPI. Developing generative AI (Gen AI), computer vision, or other machine learning applications is straightforward on the platform, which offers services such as training tutorials and workshops on how to develop and optimize models and workloads.


The Intel Tiber Developer Cloud is regularly updated with releases that add new features and services to better meet developer needs. Highlighted below are two new services that help developers build, deploy, and scale their AI applications more easily, freeing up resources for further innovation.

Intel Kubernetes* Service

New features on the Intel Tiber Developer Cloud include the Intel Kubernetes Service, a fully managed container service that helps customers run GPU-accelerated Kubernetes workloads at scale. The service lets developers use Intel® Max Series GPUs and Intel® Gaudi® 2 AI accelerators, along with Intel® Xeon® Scalable processors for broader customized configurations, as part of a Kubernetes runtime environment for AI/ML training and inference. Developers can manage their models, applications, and services with ease because the Kubernetes (K8s) complexity is hidden behind menu-driven templates. This architecture lets customers quickly spin up clusters and deploy workloads without touching the underlying infrastructure, making scaling effortless. The service is built on a oneAPI foundation with optimizations such as the Intel® Extension for PyTorch* and the Intel® Extension for TensorFlow* to improve AI model performance. Software acceleration tools such as the OpenVINO™ toolkit and the Intel® Neural Compressor are also included to enhance application performance.
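To give a sense of what those software optimizations look like in practice, below is a minimal sketch of applying the Intel Extension for PyTorch to an inference workload. The ResNet-50 model and input shape are placeholders chosen for illustration; the same pattern applies whether the code runs in a notebook or inside a container deployed through the Kubernetes service.

```python
# Minimal sketch: accelerating PyTorch inference with the Intel Extension for PyTorch.
# The model and input below are placeholders for illustration only.
import torch
import intel_extension_for_pytorch as ipex
import torchvision.models as models

model = models.resnet50(weights=None)  # any torch.nn.Module works here
model.eval()

# ipex.optimize applies graph and operator optimizations; dtype=torch.bfloat16
# enables mixed-precision execution on supported Intel hardware.
model = ipex.optimize(model, dtype=torch.bfloat16)

sample = torch.randn(1, 3, 224, 224)
with torch.no_grad(), torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    output = model(sample)

print(output.shape)
```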


The Intel Kubernetes Service lets developers evaluate multiple system configurations to see what works best for their AI/ML solutions at a lower cost than other cloud options. Currently, the Intel Kubernetes Service is available only to Premium and Enterprise account users on the platform. Information on how to use the Intel Kubernetes Service on the Intel Tiber Developer Cloud, including how to create and launch Kubernetes clusters, can be found here.

Storage as a Service

Storage as a Service (STaaS) has also been released on the Intel Tiber Developer Cloud, allowing Standard, Premium, and Enterprise account holders to create a storage volume for file storage. The storage functionality simplifies AI model training and inference by holding the large amounts of unstructured data needed to improve model accuracy. File-system storage can also be mounted and shared across multiple instances. Using the file-system storage service lets developers easily access data and scale their applications accordingly, and can help improve model latency and throughput.
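As an illustration of that shared-storage pattern, below is a minimal sketch of a PyTorch data loader that reads pre-serialized samples from a mounted volume. The mount point and file layout are hypothetical examples, not fixed paths defined by the service; use the path shown when you create and mount your own volume.

```python
# Minimal sketch: reading training data from a shared file-system volume.
# The mount point below (/mnt/shared-storage) is a hypothetical example.
from pathlib import Path

import torch
from torch.utils.data import Dataset, DataLoader

DATA_ROOT = Path("/mnt/shared-storage/datasets/my-training-set")  # hypothetical path


class FileListDataset(Dataset):
    """Loads pre-serialized tensors from the shared volume."""

    def __init__(self, root: Path):
        self.files = sorted(root.glob("*.pt"))

    def __len__(self):
        return len(self.files)

    def __getitem__(self, idx):
        return torch.load(self.files[idx])


# Because the volume can be mounted on multiple instances, the same loader
# code can run unchanged on every node that shares the storage.
loader = DataLoader(FileListDataset(DATA_ROOT), batch_size=32, num_workers=4)
for batch in loader:
    pass  # training or preprocessing step goes here
```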


A guide to Storage as a Service, including how to create, mount, and delete a storage volume, can be found here.

 

The Intel Tiber Developer Cloud includes many tools and services to meet your needs, including access to Jupyter notebooks for trying out the latest AI hardware and optimized software for hardware evaluation, model development, and education. Check out how to get started on the Intel Tiber Developer Cloud, as well as Intel’s AI tools and framework optimizations and the unified, open, standards-based oneAPI programming model that forms the foundation of Intel’s AI software portfolio. Also discover how our other collaborations with industry-leading independent software vendors (ISVs), system integrators (SIs), original equipment manufacturers (OEMs), and enterprise users accelerate AI adoption.

Additional Resources

Intel Extension for PyTorch*

Intel Extension for TensorFlow*

Intel Neural Compressor

About the Author
AI/ML Technical Software Product Marketing Manager at Intel. MBA, engineer, and two-time startup founder with a passion for all things AI and new tech.