News
Microsoft Expands Meta AI Collaboration on Azure
Artificial intelligence researchers in Meta's AI group are set to begin using the latest Microsoft Azure virtual machines (VMs) for their large-scale AI research workloads, the two companies announced.
Meta AI researchers have selected the Azure NDm A100 v4 VM series, which offers the supercomputing power of Nvidia A100 80GB GPUs, to boost the performance and scale of their AI research. The series is designed to let customers automatically and dynamically configure clusters of any size, from a few GPUs to thousands.
Meta has already used the NDm A100 v4 series of Azure VMs to train its OPT-175B language model.
The two companies are already collaborating on AI research. PyTorch, the open-source deep learning library created at Meta AI, for example, uses GPU-accelerated tensor (multidimensional array) computation to train neural networks, which are loosely modeled on the human brain. Microsoft is planning to bolster the use of PyTorch on Azure by building new PyTorch development accelerators, which it says will arrive in the coming months. Microsoft also pledged to continue PyTorch Enterprise, its support program for PyTorch developers.
Microsoft uses PyTorch, the popular open-source deep learning framework, in the Bing search engine and in Azure Cognitive Services. The company also contributes to various PyTorch-related open-source projects, including PyTorch Profiler, ONNX Runtime, and DeepSpeed, among others.
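The GPU tensor workflow described above can be sketched in a few lines of PyTorch. This is a minimal illustration only; the tensor shapes and the layer are invented for the sketch, not drawn from Meta's actual training workloads.

```python
import torch

# Use an Nvidia GPU when one is available (e.g. an A100 on an Azure
# NDm A100 v4 VM); otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# A tiny linear layer applied to a batch of input tensors.
x = torch.randn(4, 8, device=device)       # batch of 4 vectors, 8 features each
layer = torch.nn.Linear(8, 2).to(device)   # maps 8 features to 2 outputs
y = layer(x)                               # shape: (4, 2)
```

At cluster scale, the same tensor operations are distributed across thousands of GPUs; the `device` selection shown here is the single-VM building block of that pattern.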
Meta is using its AI research, in part, to support what it calls "immersive" ad campaigns, built using its Spark AR augmented reality toolkit. With these ad campaigns, a user scans a QR code with their mobile phone and then interacts with augmented reality scenes—from shopping apps that let you try on glasses or see what a sofa would look like in your living room to "experiences," such as swimming with sharks or visiting King Tut's tomb.
Meta's collaboration with Microsoft on the Azure VM series will benefit "more developers around the world," said Jerome Pesenti, vice president of AI at Meta, in a statement.
"With Azure's compute power and 1.6 TB/s of interconnect bandwidth per VM we are able to accelerate our ever-growing training demands to better accommodate larger and more innovative AI models," Pesenti said.
About the Author
Kurt Mackie is senior news producer for 1105 Media's Converge360 group.