Microsoft Licenses OpenAI's GPT-3 Language Model
- By John K. Waters
Microsoft is expanding its ongoing collaboration with artificial intelligence (AI) company OpenAI to include an exclusive license for OpenAI's GPT-3 neural-network-powered language model, the two companies announced this week. Microsoft plans to leverage that technology "to develop and deliver advanced AI solutions for our customers, as well as create new solutions that harness the amazing power of advanced natural language generation," the company said.
OpenAI will continue to offer GPT-3 and other models via its own Azure-hosted API, which was launched in June, the companies said, while Microsoft adds the capabilities of GPT-3 to its own products and services.
GPT-3 is the largest language model in the world, with a massive 175 billion parameters. It was pre-trained on the Common Crawl data set, a corpus of nearly a trillion words scraped from the Web, and trained on Azure's AI supercomputer, which Microsoft built in collaboration with OpenAI.
OpenAI was originally founded as a non-profit open-source organization by a group of investors that included Tesla founder Elon Musk. Today it comprises two entities: the non-profit OpenAI Inc. and the for-profit OpenAI LP. Microsoft, which is OpenAI's cloud services provider, invested $1 billion in the company last year.
Kevin Scott, Microsoft's CTO and executive vice president of the company's Technology and Research group, announced the GPT-3 licensing agreement in a blog post.
"We see this as an incredible opportunity to expand our Azure-powered AI platform in a way that democratizes AI technology; enables new products, services, and experiences; and increases the positive impact of AI at Scale," Scott said. "Our mission at Microsoft is to empower every person and every organization on the planet to achieve more, so we want to make sure that this AI platform is available to everyone – researchers, entrepreneurs, hobbyists, businesses – to empower their ambitions to create something new and interesting."
Microsoft's AI at Scale initiative is about combining the power of large-scale AI models and supercomputing "to fuel next generation AI capabilities at scale," the website states, "including natural language innovation across Microsoft 365 apps, services, and experiences."
GPT-3 made a splash this summer when samples of text it generated began circulating via social media. Given access through an API, a select group of beta testers demonstrated GPT-3's ability to write everything from articles and poems to working computer code and guitar tablature. Unlike other models, such as Google's BERT (Bidirectional Encoder Representations from Transformers), which require extensive fine-tuning with thousands of examples, GPT-3 can perform specific tasks without special tuning. GPT-3 can churn out the work of a poet or a programmer with fewer than 10 training examples.
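That "fewer than 10 training examples" approach is often called few-shot prompting: rather than fine-tuning the model's weights, the examples are simply written into the prompt itself, and the model infers the pattern. As a rough illustration (a minimal sketch, not OpenAI's actual API; the helper function and example task are hypothetical), a few-shot prompt can be assembled like this:

```python
# Sketch of few-shot prompting: instead of fine-tuning with thousands of
# labeled examples, a handful of input/output pairs are packed directly
# into the prompt text, and the model continues the pattern.
# build_few_shot_prompt is a hypothetical helper; it only builds the string
# that would be sent to a language model.

def build_few_shot_prompt(instruction, examples, query):
    """Assemble an instruction, (input, output) example pairs, and a new input."""
    lines = [instruction, ""]
    for example_in, example_out in examples:
        lines.append(f"Input: {example_in}")
        lines.append(f"Output: {example_out}")
        lines.append("")
    lines.append(f"Input: {query}")
    lines.append("Output:")  # the model would complete the text from here
    return "\n".join(lines)

# Hypothetical task: English-to-French translation from two examples.
examples = [("cheese", "fromage"), ("apple", "pomme")]
prompt = build_few_shot_prompt("Translate English to French.", examples, "book")
print(prompt)
```

The point of the sketch is that "training" here is nothing more than string construction; the same model can be steered to a different task by swapping out the instruction and examples.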
"The scope of commercial and creative potential that can be unlocked through the GPT-3 model is profound, with genuinely novel capabilities – most of which we haven't even imagined yet," Scott said. "Directly aiding human creativity and ingenuity in areas like writing and composition, describing and summarizing large blocks of long-form data (including code), converting natural language to another language – the possibilities are limited only by the ideas and scenarios that we bring to the table."
John K. Waters is the editor in chief of a number of Converge360.com sites, with a focus on high-end development, AI and future tech. He's been writing about cutting-edge technologies and the culture of Silicon Valley for more than two decades, and he's written more than a dozen books. He also co-scripted the documentary film Silicon Valley: A 100 Year Renaissance, which aired on PBS. He can be reached at email@example.com.