Since 2019, Microsoft has been investing in the California-based artificial intelligence company OpenAI. The goal of this partnership? To make OpenAI's AI models (such as GPT-4, ChatGPT, and DALL-E) accessible within the Microsoft 365 work environment. In 2023, this became a reality with the launch of Azure OpenAI Service, a solution that lets organizations use OpenAI's models directly from the Azure portal to build AI applications tailored to their use cases. So, how can you benefit from this technology? Mozzaik provides an overview.
Azure and OpenAI Service: Definitions
OpenAI is a U.S.-based company specializing in artificial intelligence (AI). Initially founded as a non-profit organization, OpenAI now operates through a capped-profit subsidiary in which Microsoft reportedly holds a 49% stake. The Redmond-based tech giant has invested some $13 billion in OpenAI. OpenAI's main AI solutions include:
- Advanced language models GPT-3 and GPT-4
- ChatGPT, a conversational agent based on GPT language models
- DALL-E, an image generator
- Sora, a video generation model
- Codex, a code generator
OpenAI uses Microsoft Azure's infrastructure to host and train its AI models. But what is Microsoft Azure? It is a secure cloud platform developed by Microsoft that brings together more than 200 cloud products and services, which developers can use to build solutions tailored to each organization's needs. These services are designed to meet a wide range of business requirements, including:
- Application development
- Cloud migration and modernization
- Data and analytics
- Hybrid cloud and infrastructure
- Internet of Things (IoT)
- Security and governance
- Artificial Intelligence (AI)
Azure OpenAI Service: AI Within Microsoft 365
Thanks to Azure OpenAI Service, businesses operating within the Microsoft 365 environment can now build applications that integrate OpenAI's models: they deploy a model from the Azure portal and call it through an API or SDK. This is a significant time-saver for developers and a security asset for organizations.
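As a rough illustration, here is a minimal sketch of calling an Azure OpenAI deployment from Python with the official openai SDK. The endpoint and key environment variables, the deployment name ("gpt-4o-deployment"), and the API version are placeholders for values you define in your own Azure resource.

```python
import os
from openai import AzureOpenAI

# Client pointed at your own Azure OpenAI resource (values below are assumptions).
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

# Call the model through the deployment name chosen in the Azure portal.
response = client.chat.completions.create(
    model="gpt-4o-deployment",  # hypothetical deployment name
    messages=[
        {"role": "system", "content": "You are a helpful assistant for our employees."},
        {"role": "user", "content": "Summarize our onboarding process."},
    ],
)
print(response.choices[0].message.content)
```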
OpenAI Data: Integrate ChatGPT Into Your Microsoft 365 Environment
Azure OpenAI enables organizations to easily create customized AI assistants within their Microsoft 365 environment, leveraging their own data. The main advantage? Developers do not need to retrain or fine-tune OpenAI's generative AI models to work with internal data. The result is significant time savings and solutions tailored to employees' needs.
To ensure the generative AI application delivers even more precise responses to teams, developers can define specific internal sources for the AI to reference when answering queries, using the Azure AI Search service. This approach, based on Retrieval-Augmented Generation (RAG), lets the internal conversational agent formulate responses solely from information contained in specific documents (e.g., internal procedure manuals) or in a predefined corpus of documents, as in the sketch below.
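Below is a hedged sketch of this pattern, using the "data sources" extension of the Azure OpenAI chat completions API to ground answers in an Azure AI Search index. The index name ("internal-procedures"), endpoint variables, and deployment name are assumptions, and the exact payload shape varies by API version, so treat this as an outline rather than a definitive implementation.

```python
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

# RAG over your own content: the deployment is asked to answer from an Azure AI Search index.
response = client.chat.completions.create(
    model="gpt-4o-deployment",  # hypothetical deployment name
    messages=[{"role": "user", "content": "What is the expense reimbursement procedure?"}],
    extra_body={
        "data_sources": [
            {
                "type": "azure_search",
                "parameters": {
                    "endpoint": os.environ["AZURE_AI_SEARCH_ENDPOINT"],
                    "index_name": "internal-procedures",  # hypothetical index name
                    "authentication": {
                        "type": "api_key",
                        "key": os.environ["AZURE_AI_SEARCH_KEY"],
                    },
                },
            }
        ]
    },
)
print(response.choices[0].message.content)
```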
Security: Data Processing by Azure OpenAI Services
Azure OpenAI ensures the security of its services and of the internal data accessed by OpenAI's models. The data you choose to use with the AI models is stored and processed by Microsoft via Azure OpenAI; it is not made available to OpenAI or to other Microsoft customers. Specifically, Azure OpenAI processes several types of data:
- Prompts submitted by users and content generated by services (text, images, etc.)
- Data uploaded by users via the File API or vector store for specific functionalities, such as model fine-tuning
- Message history and other content when users employ Azure OpenAI features like the Assistants API Threads
- Augmented data included in or via prompts, such as information retrieved from a URL present in the prompt itself
- Training data provided by users in the form of prompt-response pairs, enabling OpenAI models to be fine-tuned (see the JSONL sketch after this list)
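To make that last item concrete, here is a minimal sketch, under assumed file names, environment variables, and example content, of what prompt-response training data can look like and how it might be uploaded through the Files API with the openai SDK.

```python
import json
import os
from openai import AzureOpenAI

# Prompt-response pairs in the chat-style JSONL format used for fine-tuning (illustrative content).
examples = [
    {"messages": [
        {"role": "user", "content": "How do I request annual leave?"},
        {"role": "assistant", "content": "Submit a request in the HR portal at least two weeks in advance."},
    ]},
]
with open("training.jsonl", "w", encoding="utf-8") as f:
    for ex in examples:
        f.write(json.dumps(ex, ensure_ascii=False) + "\n")

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

# Upload through the Files API with the "fine-tune" purpose; the file is stored and
# processed within your Azure OpenAI resource, as described above.
uploaded = client.files.create(file=open("training.jsonl", "rb"), purpose="fine-tune")
print(uploaded.id)
```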
Azure OpenAI uses this data to generate responses through its AI models without retaining prompts or outputs, although data may be stored temporarily when features that require storage are used (e.g., Threads in the Assistants API). In all cases, processing is secured through mechanisms such as encryption, and content passes through a content filter to detect material that may violate the terms of use.
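As a hedged illustration of that last point, the sketch below shows one way an application might detect that the content filter intervened, by checking the finish reason on the response. The deployment name and credentials are placeholders, and a prompt that itself violates the content policy is typically rejected with an error instead.

```python
import os
from openai import AzureOpenAI, BadRequestError

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

try:
    response = client.chat.completions.create(
        model="gpt-4o-deployment",  # hypothetical deployment name
        messages=[{"role": "user", "content": "Draft a polite reminder email."}],
    )
    choice = response.choices[0]
    if choice.finish_reason == "content_filter":
        # The generated output was withheld by the content filter.
        print("Response withheld by the content filter.")
    else:
        print(choice.message.content)
except BadRequestError as err:
    # A prompt flagged by the content filter is typically rejected outright.
    print("Request blocked:", err)
```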