
Securing AI - and Other Leading Use Cases - with Confidential Computing


By Anil Rao, VP & GM, Systems Architecture & Engineering, Office of the CTO

Artificial Intelligence is all around us and will continue to grow as compute becomes ubiquitous, connectivity becomes pervasive, and cloud-to-edge infrastructure is built out. With so much private data and confidential intellectual property moving across shared networks and being processed on public cloud infrastructure, it is imperative that they remain secure, whatever their state and wherever they are.

While encryption of data at rest and data in transit is well established and understood by the industry, data must be unencrypted while it is being processed, and protecting that data during processing while maintaining high performance has been an industry-wide challenge. Confidential Computing arose as a solution to that challenge and is rapidly being adopted by both application developers and cloud service providers.
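To make that gap concrete, here is a minimal sketch of the three data states in Python, using the third-party cryptography package; the record contents and the processing step are invented for illustration.

```python
# Minimal sketch of the three data states. Illustrative only; requires
# the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

record = b"patient_id=123, diagnosis=..."

# Data at rest: encrypted before it is written to storage.
stored = cipher.encrypt(record)

# Data in transit: the same ciphertext can travel over the network
# (in practice, TLS provides this layer).
received = stored

# Data in use: to compute on the record, it must be decrypted, so the
# plaintext exists in memory during processing. Shielding that in-memory
# plaintext from the rest of the host is what a hardware TEE provides.
plaintext = cipher.decrypt(received)
result = plaintext.upper()  # stand-in for real processing
```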

There are three primary use cases for Confidential Computing, though more are likely to emerge. The transition from processing in the clear to processing in a hardware-based Trusted Execution Environment (TEE) is akin to the transition the industry made years ago from an open Internet to an Internet with encryption (Secure Sockets Layer, or SSL). Interest in conducting business and personal interactions online grew exponentially, commensurate with increased trust in the underlying compute infrastructure.

Emerging Use Cases:

Single Party Workloads: For companies that have long resisted moving extremely sensitive data or applications to a shared cloud, Confidential Computing offers increased assurance that no one, not even the cloud administrator, has access to the data while it is being processed. This is another layer of protection that both enterprises and CSPs are embracing.
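As a rough illustration of where that assurance comes from, the sketch below shows attestation-gated key release: the data owner hands the decryption key only to an enclave whose measured identity matches code they approved. AttestationQuote, verify_quote, and the measurement values are hypothetical placeholders, not any vendor's actual attestation API.

```python
# Hedged sketch of attestation-gated key release for a single-party workload.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AttestationQuote:
    measurement: str   # hash of the code/configuration loaded into the TEE
    signature: bytes   # signed by the hardware's attestation key

APPROVED_MEASUREMENT = "3f7a9c..."  # placeholder digest of the approved workload

def verify_quote(quote: AttestationQuote) -> bool:
    # Placeholder check: a real verifier also validates the hardware
    # signature chain before trusting the reported measurement.
    return quote.measurement == APPROVED_MEASUREMENT

def release_data_key(quote: AttestationQuote, data_key: bytes) -> Optional[bytes]:
    """The key (and therefore the plaintext) is only ever available inside an
    enclave the owner trusts; the host OS and its administrators never hold it."""
    return data_key if verify_quote(quote) else None

# An enclave running unapproved code gets nothing back.
assert release_data_key(AttestationQuote("deadbeef", b""), b"k") is None
```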

Multi-Party Collaboration: As the COVID-19 pandemic clearly demonstrated, people and organizations can benefit tremendously from sharing information, as long as they can protect the privacy of their customers and their own IP. Confidential Computing enables partners, and even competitors, to collaborate without exposing their underlying data to one another. Consider a case where traditional competitors, such as banks, have an interest in working together to detect money laundering, each contributing its data set to build a collective machine learning model from which they can all benefit.
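The sketch below is one hedged way to picture that pattern: each bank's records reach only the attested enclave, and only the jointly derived model leaves it. The data, the LabeledTransaction type, and the deliberately toy "model" are all invented for illustration, and the attestation and key-release steps are assumed to have already happened.

```python
# Hedged sketch of multi-party collaboration inside a TEE.
from dataclasses import dataclass

@dataclass
class LabeledTransaction:
    amount: float
    flagged_laundering: bool  # each bank's own investigation outcome

def combine_and_train(datasets: list[list[LabeledTransaction]]) -> float:
    """Runs inside the enclave: pools every bank's records and derives a
    shared signal that no single bank could learn from its data alone."""
    pooled = [tx for ds in datasets for tx in ds]
    flagged = [tx.amount for tx in pooled if tx.flagged_laundering]
    # Toy "model": the average amount of known-laundering transactions.
    return sum(flagged) / len(flagged)

# Each bank shares data only with the attested enclave, never with the
# other banks directly; only the trained model leaves the TEE.
bank_a = [LabeledTransaction(9_800.0, True), LabeledTransaction(120.0, False)]
bank_b = [LabeledTransaction(15_000.0, True), LabeledTransaction(60.0, False)]
print(f"shared threshold: {combine_and_train([bank_a, bank_b]):.2f}")
```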

Edge to Cloud: As discussed above, Confidential Computing will enable protection across the entire data chain, from the edge to the cloud. AI at the edge is frequently focused on inferencing, whereas machine learning models are typically trained in compute-intensive environments such as public clouds. This means protecting inferencing engines operating at the edge from malicious software that may be present on the same device or accessing shared memory, as well as protecting the central learning models themselves from attack while they run in the cloud.
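One slice of that chain is sketched below: an inference model that ships encrypted and is only ever decrypted inside the edge TEE's protected memory, with results re-encrypted before they head back to the cloud. It reuses the cryptography package from the earlier sketch, and the model bytes and scoring logic are placeholders.

```python
# Hedged sketch of edge-side protection in an edge-to-cloud pipeline.
# Assumes this code runs inside an edge TEE that obtained model_key via
# attestation; the TEE mechanics themselves are outside this snippet.
from cryptography.fernet import Fernet

model_key = Fernet.generate_key()   # in practice: released only after attestation
model_cipher = Fernet(model_key)

# The inference model ships and rests encrypted, so co-resident malware
# on the device only ever sees ciphertext.
encrypted_model = model_cipher.encrypt(b"weights: [0.2, -1.3, 0.7]")

def run_inference(sample: bytes) -> bytes:
    # Decryption and the forward pass happen only inside protected memory.
    weights = model_cipher.decrypt(encrypted_model)
    # Stand-in for a real forward pass over `sample` using `weights`.
    score = (len(weights) + len(sample)) % 100 / 100
    return f"score={score:.2f}".encode()

# Results destined for the central model in the cloud go back encrypted,
# keeping the training pipeline protected end to end.
result_for_cloud = model_cipher.encrypt(run_inference(b"sensor reading"))
```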

Conclusion:

Evolving demands for security and privacy are again transforming compute. Confidential Computing will be a paradigm shift that enables further growth of services that run on shared infrastructure as well as new use cases that depend upon collaboration among ecosystem partners.

About the Author
Intel Product Assurance and Security (IPAS) is designed to serve as a security center of excellence – a sort of mission control – that looks across all of Intel. Beyond addressing the security issues of today, we are looking longer-term at the evolving threat landscape and continuously improving product security in the years ahead.