Microsoft Aims to Deliver 'Confidential AI' with Hardware Partners
In a recent blog post, Microsoft provided an update on its ongoing efforts to deliver "confidential AI" to Azure customers through its hardware partnerships.
"In Microsoft Azure, we are continually innovating to enhance security," Mark Russinovich, CTO and Technical Fellow in the Microsoft Azure group, wrote. "One such pioneering effort is our collaboration with our hardware partners to create a new foundation based on silicon, that enables new levels of data protection through the protection of data in memory using confidential computing."
Through confidential computing, Microsoft aims to protect data in the third stage of its lifecycle, Russinovich explained.
"Data exists in three stages in its lifecycle: in use (when it is created and computed upon), at rest (when stored), and in transit (when moved). Customers today already take measures to protect their data at rest and in transit with existing encryption technologies," he wrote. "However, they have not had the means to protect their data in use at scale. Confidential computing is the missing third stage in protecting data when in use via hardware-based trusted execution environments (TEEs) that can now provide assurance that the data is protected during its entire lifecycle."
Microsoft has already enabled confidential computing on Azure central processing units (CPUs) and virtual machines (VMs). Now it's addressing the GPUs used with Azure services, which are increasingly called on to run data through AI models, a process known as "inferencing."
The confidential AI collaboration with Nvidia is currently in preview on Azure, using VMs based on Nvidia H100-PCIe Tensor Core GPUs, Russinovich explained.
"Azure has been working closely with NVIDIA for several years to bring confidential computing to GPUs. And this is why, at Microsoft Ignite 2023, we announced Azure confidential VMs with NVIDIA H100-PCIe Tensor Core GPUs in preview. These Virtual Machines, along with the increasing number of Azure confidential computing (ACC) services, will allow more innovations that use sensitive and restricted data in the public cloud."
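For readers who want to try the preview, provisioning one of these confidential GPU VMs follows the standard Azure CLI workflow. The sketch below is illustrative rather than official guidance: the resource names are placeholders, and the VM size (`Standard_NCC40ads_H100_v5`) and security flags are assumptions based on Azure's confidential VM conventions, so they should be verified against current Azure documentation before use.

```shell
# Illustrative sketch only: resource-group and VM names are placeholders,
# and the size Standard_NCC40ads_H100_v5 is an assumption based on Azure's
# confidential GPU VM naming -- confirm the exact size and flags in the
# Azure docs for your region.
az group create --name cc-demo-rg --location eastus2

az vm create \
  --resource-group cc-demo-rg \
  --name cc-h100-vm \
  --size Standard_NCC40ads_H100_v5 \
  --image Ubuntu2204 \
  --security-type ConfidentialVM \
  --os-disk-security-encryption-type VMGuestStateOnly \
  --enable-secure-boot true \
  --enable-vtpm true
```

The `--security-type ConfidentialVM` flag is what distinguishes this request from an ordinary GPU VM: it asks Azure to place the guest in a hardware-based trusted execution environment of the kind the blog post describes.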
Organizations will get security assurances for the intellectual property of their AI models with this effort, and they will be able to collaborate with other parties "without ever exposing their models or data," the announcement suggested. The weights used by AI models, for instance, won't be visible.
"Confidential AI can enhance the security and privacy of AI inferencing by allowing data and models to be processed in an encrypted state, preventing unauthorized access or leakage of sensitive information," Russinovich wrote.
Bringing confidential computing to GPUs is the latest step in Microsoft's broader confidential computing efforts, which also encompass CPUs, virtual machines, and containers. Microsoft's aim is to enable confidential computing across Azure.
"Eventually confidential computing will become the norm, with pervasive memory encryption across Azure’s infrastructure, enabling organizations to verify data protection in the cloud throughout the entire data lifecycle," Russinovich wrote.
Microsoft also announced a number of other confidential computing advancements at Ignite, including confidential containers on the Azure Kubernetes Service (in preview). Also in preview is the Azure Managed Confidential Consortium Framework for building and hosting decentralized applications "where nodes executing the transactions cannot access the contents," which limits what information is shared among the participating parties.
Microsoft also announced that a preview of DCesv5- and ECesv5-series Azure confidential VMs, based on 4th Gen Intel Xeon processors, is coming in December. These confidential VMs, with support for "up to 128 vCPUs," will let Azure customers "migrate their most sensitive workloads to Azure with minimal performance impact and without code changes," Microsoft indicated.
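Once the preview lands, customers can check which of these confidential VM sizes are offered in a given region from the Azure CLI. This is a generic sketch, not official guidance: the DCesv5/ECesv5 series names come from Microsoft's announcement, but the substring filter below is an assumption and may also match related size families.

```shell
# Illustrative: list VM sizes in a region whose names match the "es_v5"
# confidential series (e.g. Standard_DC4es_v5). The JMESPath filter is a
# rough sketch and may catch neighboring size families as well.
az vm list-sizes --location westeurope \
  --query "[?contains(name, 'es_v5')].{name:name, vCPUs:numberOfCores}" \
  --output table
```

Running this in different regions is a quick way to confirm where the preview sizes, including the larger vCPU counts, have actually rolled out.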
Kurt Mackie is senior news producer for 1105 Media's Converge360 group.