Azure OpenAI's ChatGPT Integration Sees Daylight
Microsoft's Azure OpenAI service already supports DALL-E 2, GPT-3.5 and Codex. On Thursday, Microsoft previewed support for the next OpenAI tool: the ChatGPT chat engine.
Microsoft's goal in bringing ChatGPT to Azure OpenAI is to help organizations bolster their chatbots for use with customer service issues, among other use cases. Here's Microsoft's characterization of those use cases:
Now with ChatGPT in preview in Azure OpenAI Service, developers can integrate custom AI-powered experiences directly into their own applications, including enhancing existing bots to handle unexpected questions, recapping call center conversations to enable faster customer support resolutions, creating new ad copy with personalized offers, automating claims processing, and more.
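Under the hood, that integration goes through the Azure OpenAI chat-completions endpoint. As a rough sketch of what a request might look like (the resource name, deployment name and API version below are placeholder assumptions, not values from the announcement):

```python
import json

# Placeholder values -- an actual app would use its own Azure OpenAI
# resource name and the deployment name chosen for the ChatGPT model.
RESOURCE = "my-resource"
DEPLOYMENT = "my-chatgpt"
API_VERSION = "2023-03-15-preview"  # assumed preview API version


def chat_url() -> str:
    """Build the chat-completions URL for an Azure OpenAI deployment."""
    return (f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
            f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}")


def chat_payload(user_message: str) -> str:
    """Assemble a minimal chat request body: a system prompt that frames
    the bot's role, followed by the customer's question."""
    return json.dumps({
        "messages": [
            {"role": "system",
             "content": "You are a customer-support assistant."},
            {"role": "user", "content": user_message},
        ],
        "max_tokens": 200,
    })
```

The payload would then be POSTed to the URL with the resource's API key in the request headers; the same message structure underlies the "enhancing existing bots" and call-center scenarios Microsoft lists.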
Microsoft is also upselling the use of its Azure Cognitive Search service with these ChatGPT-enhanced chatbots. Integration with Azure Cognitive Search will let organizations connect the chatbot with specific company data, avoiding generic responses. The Azure Cognitive Search service acts as "an external knowledge base that can retrieve pieces quickly and with good relevance," Microsoft explained in its announcement.
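The pattern Microsoft describes boils down to: query the knowledge base first, then fold the retrieved passages into the chat prompt so the model answers from company data. A minimal sketch of the prompt-assembly step (the function and its wording are illustrative; in practice the passages would come from an Azure Cognitive Search query):

```python
def build_grounded_messages(question: str, passages: list[str]) -> list[dict]:
    """Prepend retrieved passages to the system prompt so the model
    answers from the supplied sources rather than generically."""
    context = "\n\n".join(passages)
    return [
        {"role": "system",
         "content": "Answer using only the sources below.\n\nSources:\n" + context},
        {"role": "user", "content": question},
    ]


# Example: in a real app, the passage list is the search service's top hits.
messages = build_grounded_messages(
    "What is our return window?",
    ["Returns are accepted within 30 days of purchase."],
)
```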
Microsoft touted its Azure OpenAI Studio as a development environment that supports creating "no-code" intelligent apps. It can also be used to "customize ChatGPT."
Microsoft already has testimonials on the use of the ChatGPT preview in the Azure OpenAI service. It's being used by Office Depot as an HR chatbot that generates "new job descriptions," as well as for "enhancing associate communication." Icertis is using the preview with contract data to create "an intelligent assistant that surfaces and unlocks insights throughout the contract lifecycle." The government of Singapore is using the preview to deliver better public-sector services, per the announcement.
Azure OpenAI and Other AI
In general, Microsoft has been using "the power of large language models from OpenAI and the AI-optimized infrastructure of Azure" to support its various business and consumer products.
For instance, GitHub Copilot, which offers pair programmer support, uses the Azure OpenAI service. Other products listed by Microsoft's announcement as using some form of AI include Microsoft Bing for search, Microsoft Viva Sales for bolstering sales efforts and the Microsoft Teams Premium collaboration service.
The Azure OpenAI service can be used in various scenarios, according to Microsoft. It can be used for content generation in response to customer queries, or for user interface personalizations on Web sites. It can be used to summarize content in customer-support scenarios or to summarize trends from social media. The Azure OpenAI service also can generate code from natural language prompts, including SQL queries and code documentation. The service also bolsters search with its "knowledge mining" capabilities.
Azure OpenAI Costs
The ChatGPT preview in the Azure OpenAI service can be tried now, but Microsoft may charge for some aspects of its use, even during the preview.
"Customers can begin using ChatGPT today," the announcement stated. "It is priced at $0.002/1k tokens and billing for all ChatGPT usage begins March 13th."
On top of the use rate for models (called "inferencing" and billed based on "tokens"), Azure OpenAI has hosting and training costs. Pricing details are listed on Microsoft's Azure OpenAI pricing page.
The inferencing rate per 1,000 tokens varies depending on which AI model is selected. For instance, it's the Curie model that's priced at "$0.002/1k tokens."
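At that rate the billing arithmetic is simple: usage is metered per 1,000 tokens, so a sketch of the cost calculation (using the $0.002 figure quoted above) looks like:

```python
def inference_cost(tokens: int, rate_per_1k: float = 0.002) -> float:
    """Dollar cost of an inferencing call, billed per 1,000 tokens."""
    return tokens / 1000 * rate_per_1k


# One million tokens at the quoted $0.002/1k rate comes to $2.00.
print(inference_cost(1_000_000))
```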
The pricing page doesn't go into much detail on these matters beyond describing the Azure OpenAI service as having a "pay as you go" type of pricing structure. However, Microsoft's pricing page does include a chat popup box to get help, which shows a picture of a smiling human.
Kurt Mackie is senior news producer for 1105 Media's Converge360 group.