New OpenAI Service Provides GPT-3 Access Via Microsoft Azure
- By John K. Waters
OpenAI's ground-breaking GPT-3 natural language (NL) machine learning models will now be available through a new service on Microsoft's Azure cloud platform. The new service, called the Azure OpenAI Service, uses Azure Cognitive Services to access OpenAI's powerful GPT-3 NL models. Available initially by invitation only, the service will include features for security, compliance, and data privacy that are built into the Azure platform.
Redmond unveiled the new service at its recent Ignite conference.
Eric Boyd, Microsoft corporate vice president for Azure AI, emphasized in a statement that Microsoft was "…just in the beginning stages of figuring out what the power and potential of GPT-3 is."
"Now we are taking what OpenAI has released and making it available with all the enterprise promises that businesses need to move into production," Boyd said.
Microsoft also plans to offer new tools to its Azure OpenAI Service customers to help ensure that outputs the model returns are "appropriate for their businesses," and it will monitor how people are employing the technology "to help ensure it’s being used for its intended purposes."
GPT-3 is the largest language model in the world, comprising 175 billion parameters. The model is pre-trained on the Common Crawl data set, a corpus of almost a trillion words scraped from the Web, and was trained on Azure's AI supercomputer, which Microsoft built in collaboration with OpenAI.
OpenAI was originally founded as a non-profit research organization by a group of backers that included Tesla CEO Elon Musk. Today it comprises two entities: the non-profit OpenAI Inc. and the for-profit OpenAI LP. Microsoft, which is OpenAI's cloud services provider, invested $1 billion in the company in 2019.
The OpenAI API is the organization's first commercial product. OpenAI claims that more than 300 applications now use GPT-3 via the API, and that tens of thousands of developers worldwide are building on the platform. "We currently generate an average of 4.5 billion words per day, and continue to scale production traffic," OpenAI says on its website.
"The potential enterprise uses for GPT-3 range from summarizing common complaints in customer service logs, to helping developers code faster without having to stop and search for examples, to generating new content as starting points for blog posts," explained Dominic Divakaruni, the Microsoft group product manager leading Azure OpenAI.
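To make the customer-service scenario concrete, here is a minimal sketch of how a developer might frame a complaint-summarization task as a plain-text prompt for a completion-style model such as GPT-3. The prompt wording and the `build_summary_prompt` helper are illustrative assumptions, not part of any Microsoft or OpenAI API; the resulting string is what would be sent to the model's completion endpoint.

```python
# Illustrative sketch: framing "summarize common complaints" as a prompt.
# The helper and prompt format are assumptions for illustration only.

def build_summary_prompt(log_entries):
    """Join raw complaint log lines into one instruction-style prompt."""
    joined = "\n".join(f"- {entry}" for entry in log_entries)
    return (
        "Summarize the most common complaints in the following "
        "customer service log entries:\n"
        f"{joined}\n"
        "Summary:"
    )

logs = [
    "App crashes when uploading photos larger than 10 MB.",
    "Checkout page freezes on the payment step.",
    "Upload of large images fails with an error.",
]
prompt = build_summary_prompt(logs)
print(prompt)
```

The model's completion of the trailing "Summary:" line would be the generated summary; no task-specific training is involved.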
Microsoft has already tapped OpenAI's Codex system, which translates natural language into code, to power GitHub Copilot, the "AI pair programmer" announced earlier this year. And at its annual Build developer conference in June, the company announced plans to integrate GPT-3 with its PowerApps product suite.
"GPT-3 has really proven itself as the first powerful, general-purpose model for natural language," said OpenAI CEO Sam Altman, in a statement. "It’s one model you can use for all these things, which developers love, because you can try things very easily. For a while now, we’ve wanted to figure out a way to scale it as broadly as possible, which is part of the thing that really excites us about the partnership with Microsoft."
Although GPT-3 has been publicly available since last year through an API managed by OpenAI, some potential users have requested additional layers of security, access management, private networking, data handling protections, or scaling capacity, which Microsoft says it will provide in the Azure OpenAI Service.
"It really is a new paradigm where this very large model is now itself the platform," Boyd added. "So, companies can just use it and give it a couple of examples and get the results they need without needing a whole data science team and thousands of GPUs and all the resources to train the model. I think that’s why we see the huge amount of interest around businesses wanting to use GPT-3; it’s both very powerful and very simple."
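The "give it a couple of examples" pattern Boyd describes is commonly called few-shot prompting: instead of training a model, a handful of labeled examples is prepended to the input and the pre-trained model infers the task. The sentiment-labeling format below is a common convention used as an assumed illustration, not an official Azure or OpenAI interface.

```python
# Sketch of few-shot prompting: labeled examples followed by a query.
# The "Review:/Sentiment:" format is an illustrative convention only.

def few_shot_prompt(examples, query):
    """Build a few-shot classification prompt from (text, label) pairs."""
    lines = [f"Review: {text}\nSentiment: {label}" for text, label in examples]
    lines.append(f"Review: {query}\nSentiment:")
    return "\n\n".join(lines)

examples = [
    ("The service was quick and friendly.", "Positive"),
    ("My order arrived broken and late.", "Negative"),
]
print(few_shot_prompt(examples, "Great value for the price."))
```

Sent to a model like GPT-3, the completion of the final "Sentiment:" line is the prediction, which is why no data science team or GPU cluster is needed on the customer's side.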
The Azure OpenAI Service is the latest offering to emerge from a partnership between Microsoft and OpenAI that "aims to accelerate breakthroughs in AI," the two companies said when they announced it in May of this year. The plan is to jointly develop the first supercomputer on Azure to commercialize new AI technologies.
John K. Waters is the editor in chief of a number of Converge360.com sites, with a focus on high-end development, AI and future tech. He's been writing about cutting-edge technologies and the culture of Silicon Valley for more than two decades, and he's written more than a dozen books. He also co-scripted the documentary film Silicon Valley: A 100 Year Renaissance, which aired on PBS. He can be reached at firstname.lastname@example.org.