The ChatGPT buzz continues to grow, as Microsoft has announced the general availability of its Azure OpenAI Service, an offering related to its $1 billion investment in OpenAI, the maker of ChatGPT. That means more businesses can apply to use OpenAI’s Azure-hosted and -trained large language models (LLMs), such as GPT-3.5, the LLM family behind ChatGPT; DALL-E 2, the AI that generates images from casual descriptions; and Codex, the GPT-3-based foundation of GitHub’s Copilot AI pair-programming service.
The company has also indicated that it will add ChatGPT itself to Azure “soon.”
Microsoft already uses its Azure OpenAI Service to power GitHub Copilot, the $10-per-month service that suggests lines of code to developers inside their code editors.
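To make the offering concrete, here is a minimal sketch of what a call to an Azure OpenAI completions deployment looks like at the REST level. The endpoint hostname, deployment name, and API version below are hypothetical placeholders (each subscriber creates their own resource and deployment names); this only assembles the request rather than sending it, since a real call requires an API key.

```python
import json

# Hypothetical placeholders -- an Azure OpenAI subscriber chooses their own
# resource endpoint and model deployment name in the Azure portal.
ENDPOINT = "https://my-resource.openai.azure.com"
DEPLOYMENT = "my-gpt35-deployment"
API_VERSION = "2022-12-01"

def build_completion_request(prompt: str, max_tokens: int = 100):
    """Assemble the URL and JSON body for an Azure OpenAI completions call.

    Azure routes requests to a named deployment of a model, rather than
    to the model name directly as OpenAI's own API does.
    """
    url = (f"{ENDPOINT}/openai/deployments/{DEPLOYMENT}"
           f"/completions?api-version={API_VERSION}")
    body = json.dumps({"prompt": prompt, "max_tokens": max_tokens})
    return url, body

url, body = build_completion_request("Summarize this report in two sentences:")
print(url)
print(body)
```

In a real application the request would be sent with an `api-key` header tied to the Azure resource, which is part of why enterprises can gate and audit access in ways the public ChatGPT interface does not allow.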
And let’s look at the MIT Technology Review, which has an article on how Microsoft might leverage more OpenAI products. How might that play out? Search is the obvious first answer, then infusions of AI to products like Outlook and Office. Quote: “language models could be integrated into Word to make it easier for people to summarize reports, write proposals, or generate ideas, Shah says. They could also give email programs and Word better autocomplete tools, he adds. And it’s not just all word-based. Microsoft has already said it will use OpenAI’s text-to-image generator DALL-E to create images for PowerPoint presentations too.”
Why do we care?
Even if providers never sell ChatGPT or an AI product directly, Microsoft’s moves should give them pause. Training teams to use these tools effectively, and in a way that benefits the business while still protecting the organization’s core intellectual property, is going to be the name of the game. Providers should be considering these implications now because, at the current speed of product development, these tools will be widespread faster than you can spell AI.