
Taking IPUs Beyond Virtualized Cloud


The exceptional efficiency and flexibility of Intel® Infrastructure Processing Units (IPUs) are igniting wider adoption and unlocking more possibilities. In my last blog, I talked about Intel® IPUs revolutionizing how cloud service providers manage their infrastructure and signaling the beginnings of a technological transformation. This transformation has gained enough momentum that ABI Research recognized Intel as the overall leader among intelligent accelerator providers, the industry leaders that design and implement IPUs, SmartNICs and data processing units (DPUs).

Today, IPUs are primarily used for virtualized cloud use cases: six of the eight largest hyperscalers use IPUs for their virtualized cloud services, as Google Cloud does with its latest H3 virtual machine instance. Although hyperscalers are taking advantage of IPUs, the benefits can extend well beyond them. Since my previous blog, we have observed other growing use cases for IPUs, and I'd like to share some of them here as we look toward a future where an ecosystem extends the benefits of IPUs beyond hyperscalers and makes IPUs more available to various market segments.


Emerging use cases for IPUs

Network appliances

Network appliances are designed to perform specific network functions or services within a network. From load balancers and proxy servers to network security appliances, IPUs can make network appliances more efficient and adaptable to the evolving demands of networks.
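As a rough illustration, the per-flow classification a load balancer performs on every new connection is the kind of packet-processing work an IPU can take off the host CPU. The sketch below is a minimal, hypothetical example (the backend addresses and function name are invented for illustration, not drawn from any Intel API):

```python
import hashlib

# Hypothetical backend pool for an L4 load balancer.
backends = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]

def pick_backend(src_ip, src_port, dst_ip, dst_port, proto):
    """Hash the connection 5-tuple to choose a backend deterministically.

    On a software appliance this runs on the host CPU for every flow;
    an IPU can perform the equivalent classification in hardware.
    """
    five_tuple = f"{src_ip}:{src_port}->{dst_ip}:{dst_port}/{proto}".encode()
    digest = hashlib.sha256(five_tuple).digest()
    return backends[int.from_bytes(digest[:4], "big") % len(backends)]
```

Because the hash is deterministic, packets from the same flow always reach the same backend, which is the property a real load balancer needs when this logic is moved into offload hardware.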

Ethernet smart switches, a type of network appliance, provide advanced control and management features such as port security, access control lists, remote management and link aggregation. IPUs can improve Ethernet smart switches by delivering a significant increase in performance at lower power and cost, enhancing security and enabling advanced network features. They can also reduce network traffic and improve total cost of ownership (TCO).

Storage target, disaggregated storage and storage initiators

In storage target use cases where data is intended to be stored, archived, backed up and retrieved, IPUs can dramatically improve data processing efficiency, data transfer speeds, data security and overall data management. By offloading specific data-related tasks from the main CPU, IPUs allow storage targets to operate more efficiently and effectively using integrated accelerators and advanced network technology. This is especially beneficial in scenarios where data performance, security and responsiveness are critical. The increased efficiency and effectiveness brought by IPUs also yield improved TCO, especially when taking a headless storage approach, since that involves simpler hardware and lower maintenance.
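To make those data-related tasks concrete, the sketch below (hypothetical names, Python standard library only) shows the per-block compression and integrity-checksum work a storage target performs on every write, the kind of work an IPU's integrated engines can absorb from the host CPU:

```python
import zlib

def ingest_block(data: bytes) -> dict:
    """Write path: compress the block and record an integrity checksum.

    On a host-CPU storage target, both operations cost cycles per block;
    an IPU's compression and crypto engines can perform them inline.
    """
    compressed = zlib.compress(data, level=6)
    return {"payload": compressed, "crc32": zlib.crc32(data), "orig_len": len(data)}

def read_block(record: dict) -> bytes:
    """Read path: decompress and verify the block before returning it."""
    data = zlib.decompress(record["payload"])
    assert zlib.crc32(data) == record["crc32"]  # detect corruption on read
    return data
```

The round trip preserves the data exactly; what an IPU changes is where these cycles are spent, not the semantics of the storage path.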


IPUs in a storage target use case

Because disaggregated storage is intended to separate storage resources from compute resources, IPUs are an optimized solution for disaggregated storage within a cloud data center's infrastructure. They can accelerate data processing, storage and security as well as allocate storage resources dynamically.

Storage initiators are used to access and retrieve data from storage devices, such as disk arrays or storage servers. IPUs can play a role in enhancing the performance, efficiency and security of storage initiator use cases. They can accelerate data transfers, support advanced data services and perform data deduplication and compression tasks efficiently.

Telco edge

Telco edge computing is increasingly relevant with 5G networks and the growing demand for applications and services that require real-time processing and low-latency communication. The telco edge is evolving from closed, proprietary hardware and software toward more open, multi-vendor 5G network infrastructure, with the intent of using general-purpose compute to manage network infrastructure.


IPUs in a telco edge use case

IPUs at the edge of a telecommunications network can make this intent more of a reality by enabling low-latency, high-performance and distributed computing capabilities. Consolidation of site network equipment into an IPU reduces capital expenditures and creates a solution with a smaller footprint, which results in improved TCO. By providing secure isolation and traffic separation of tenant applications, IPUs can support real-time applications, optimize network performance, enhance security and increase efficiency.

AI/ML as a service

Artificial Intelligence/Machine Learning (AI/ML) as a service delivers AI/ML capabilities as cloud-based, on-demand services. Instead of building and maintaining their own AI/ML infrastructure, users and organizations leverage these capabilities through a cloud service provider's platform.

IPUs can play a significant role in delivering AI/ML as a service by enhancing performance, scalability, energy efficiency and security. They can accelerate model training and inference by offloading those tasks from the host CPU. With IPUs, AI/ML workloads can be processed in parallel, allowing for increased scalability. They also enable the deployment of AI/ML workloads in various scenarios, from cloud-based services to edge devices, ultimately providing users with faster and more efficient AI/ML capabilities while reducing operational overhead for service providers.

Bare metal cloud

Bare metal cloud provides users with access to dedicated physical servers. Unlike traditional cloud computing, where users typically work with virtualized servers and shared resources, bare metal cloud gives users exclusive access to the physical server hardware. Bare metal cloud use cases offer the best of both worlds by providing a cloud-based management interface to access the bare metal servers.

IPUs can be an invaluable component for bare metal cloud environments, providing users with the ability to harness specialized hardware acceleration for their workloads while enjoying the benefits of bare metal server performance, flexibility and openness. This integration allows users to achieve optimal performance and efficiency for data-intensive tasks without the overhead of virtualization.


Security

Security is a crucial use case for IPUs, given their ability to accelerate and enhance various security-related operations using the advanced crypto and compression engines built into the IPU. Because IPUs are designed to offload infrastructure tasks from the CPU, those tasks can specifically be security-related workloads.

IPUs play a pivotal role in securing data, networks and computing environments by taking on resource-intensive security tasks, resulting in more efficient and effective security measures. These tasks include encryption and decryption, secure key management, and security analytics and threat detection. Offloading security tasks from the CPU onto the IPU and utilizing its advanced crypto and compression engines allow these tasks to be performed at wire speed.
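As a purely illustrative sketch of that per-packet crypto cost (hypothetical names, Python standard library only, not an Intel API), consider authenticating every packet in a traffic stream. On a host this consumes CPU cycles in proportion to traffic volume, which is exactly the work dedicated crypto engines can perform at line rate:

```python
import hashlib
import hmac
import os

def authenticate_packets(key, packets):
    """Compute an HMAC-SHA256 tag for every packet.

    Each tag costs host CPU cycles; an IPU's crypto engines can
    perform the equivalent operation in hardware at wire speed.
    """
    return [hmac.new(key, p, hashlib.sha256).digest() for p in packets]

key = os.urandom(32)                          # per-session secret key
packets = [os.urandom(1500) for _ in range(8)]  # MTU-sized payloads
tags = authenticate_packets(key, packets)
```

The point of the sketch is the shape of the workload, one fixed-cost crypto operation per packet, rather than the specific algorithm; real deployments would use the cipher suites their protocols require.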

More to discover with new technological innovations

With new technological innovations, there is always more to discover about how the technology can be used to optimize operations for better performance, efficiency and overall TCO. Our vision at Intel is to make IPUs easy to adopt across new, emerging use cases from the data center to the edge. Perhaps these are use cases you had not considered before, or better yet, they may spark ideas for additional use cases that would benefit your own needs.


Notices and Disclaimers:

Intel technologies may require enabled hardware, software, or service activation.

No product or component can be absolutely secure.

Your costs and results may vary.

Intel does not control or audit third-party data. You should consult other sources to evaluate accuracy.

© Intel Corporation. Intel, the Intel logo and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.

About the Author
Kristie Mann is VP/GM for the IPU Business Unit in Intel’s Network and Edge Group. She focuses on developing the ecosystem for networking infrastructure and storage offload products. She was previously VP Product Management for Optane Products in Intel’s Datacenter and AI Group. Before joining Intel, she was Sr Manager R&D Hardware at HPE, where she was responsible for delivering mechanical hardware and thermal systems development for workstation lines that produced $1.2B of revenue annually. She earned a BSME at Georgia Tech and an MBA at Duke University. She has been a presenter at Tech Field Day, Global Stac Live and SmartNICs Summit and has been interviewed or quoted in many leading technology media such as Blocks & Files, Futurum Research and InfoWorld.