Azure OpenAI integration setup
Navigate to the Azure Portal.
Click +Create a Resource.
Search for Azure OpenAI.
Click the tile for Azure OpenAI.
Request access to the Azure OpenAI service.
Wait for the welcome email.
Note that activation of the service can take up to 48 hours, based on Microsoft's timelines.
Return to the Azure Portal upon receiving the approval email.
Create the OpenAI Service in your Azure subscription.
Navigate to Resource Management > Model deployments.
Click +Create a new Deployment.
Choose the gpt-35-turbo model.
Name the deployment gpt-35-turbo.
Return to the OpenAI Resource in the Azure Portal.
Navigate to Keys and Endpoint.
Copy a key to your clipboard.
Copy the region and endpoint and store them in a safe location. You'll need them to construct your Base URL in Rewst.
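The key you copied is what authenticates each request. Azure OpenAI expects it in an api-key header rather than the Authorization: Bearer header used by openai.com. A minimal sketch, where azure_headers is a hypothetical helper named for illustration and the key value is a placeholder:

```python
def azure_headers(api_key: str) -> dict:
    # Azure OpenAI authenticates with an "api-key" header,
    # not the "Authorization: Bearer" header used by openai.com.
    return {
        "api-key": api_key,
        "Content-Type": "application/json",
    }

# Example usage with a placeholder key:
headers = azure_headers("<your-key>")
print(headers["Content-Type"])
```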
If you already have an OpenAI integration configured, you can add an instance for Azure.
Navigate to the integration's configuration page at Configuration > Integrations > OpenAI.
Click Default > Add Configuration.
Name the configuration Azure.
Confirm that your Base URL reflects the accurate resource and region collected from Azure. Incorrect URLs will prevent Rewst from accessing the OpenAI service.
Enter the Azure API Version.
The Base URL is what directs Rewst to communicate with the correct Azure OpenAI instance. Here’s how to construct it:
Anatomy of the Base URL: The standard structure of the Base URL provided by Azure typically looks like this: https://<resource-name>.openai.azure.com/openai/deployments/<deployment-name>
If you are using GPT 3.5 Turbo, for example, you will append it to the URL like so: https://<resource-name>.openai.azure.com/openai/deployments/gpt-35-turbo
Customize your Base URL: Depending on your service instance or specific requirements, you may need to alter the Base URL. For example, if you have a custom deployment named custom-model, you would append it to the URL like so: https://<resource-name>.openai.azure.com/openai/deployments/custom-model
Ensure you replace <resource-name> and <deployment-name> with the actual information from your Azure OpenAI service.
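The anatomy above can be sketched as a small helper. build_base_url is a hypothetical name used for illustration, and "contoso-ai" is a placeholder resource name, not a real Azure resource:

```python
def build_base_url(resource_name: str, deployment_name: str) -> str:
    # Assemble the Base URL from the Azure resource name and the
    # model deployment name, following the pattern shown above.
    return (
        f"https://{resource_name}.openai.azure.com"
        f"/openai/deployments/{deployment_name}"
    )

# With a resource named "contoso-ai" and the gpt-35-turbo deployment:
print(build_base_url("contoso-ai", "gpt-35-turbo"))
# → https://contoso-ai.openai.azure.com/openai/deployments/gpt-35-turbo
```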
You must complete the Azure API Version field, otherwise, the integration will fail.
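The API version is required because Azure sends it as an api-version query parameter on every request. A hedged sketch of how a full request URL is assembled, assuming a chat completions path; request_url is a hypothetical helper, and the version string is only an example, so use the version Azure documents for your deployment:

```python
from urllib.parse import urlencode

def request_url(base_url: str, path: str, api_version: str) -> str:
    # Append an endpoint path and the mandatory api-version
    # query parameter to the Base URL.
    return f"{base_url}/{path}?{urlencode({'api-version': api_version})}"

url = request_url(
    "https://contoso-ai.openai.azure.com/openai/deployments/gpt-35-turbo",
    "chat/completions",
    "2023-05-15",  # example version string only
)
print(url)
```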
If this is your second instance of OpenAI, meaning you configured the direct OpenAI integration earlier, you can adjust your workflow triggers to choose which instance of the integration to use in your workflows.
Navigate to the integration's configuration page at Configuration > Integrations > OpenAI.
Find Integration Overrides under Trigger Configuration.
Choose OpenAI in the Integration drop-down selector.
Choose which of your OpenAI integration instances to use in the Integration Configuration drop-down selector.
If you have not yet set up the OpenAI integration in Rewst, first follow the steps in our general OpenAI integration setup documentation.
Save your configuration.