Azure OpenAI Integration Setup

  1. Navigate to your Microsoft Azure Portal.

  2. Click the ➕ button to Create a Resource.

  3. Search for Azure OpenAI.

  4. Choose Azure OpenAI.

  5. Request access to the Azure OpenAI service.

You must fill out a form to request access to the Azure OpenAI service.

  6. Wait for the welcome email.

Note that activation of the service can take up to 48 hours, based on Microsoft's timelines.

  7. Return to the Azure Portal upon receiving the approval email.

  8. Create the OpenAI service in your Azure subscription.

  9. Navigate to Model deployments.

  10. Create a new deployment.

  11. Choose the gpt-35-turbo model.

  12. Name the deployment the same as the model. This keeps it easier to remember.

  13. Return to the OpenAI resource in the Azure Portal.

  14. Navigate to Keys and Endpoint.

  15. Copy a key to your clipboard.

  16. Note the region and endpoint. You'll need them to craft your Base URL in Rewst.

  17. Return to Rewst.

  18. Navigate to Integrations.

  19. Choose OpenAI.

Configuring Azure OpenAI with Existing OpenAI Integrations

If you already have an OpenAI integration configured, you can add an instance for Azure.

  1. Click the drop-down → Add Configuration.

  2. Name the configuration (example: "Azure").

  3. Confirm your Base URL reflects the accurate resource and region collected from Azure. Incorrect URLs will prevent Rewst from accessing the OpenAI service.

  4. Add the Azure API Version.

Crafting the Base URL

The Base URL is what directs Rewst to communicate with the correct Azure OpenAI instance. Here’s how to construct it:

Anatomy of the Base URL: The standard structure of the Base URL provided by Azure typically looks like this: https://<resource-name>.openai.azure.com/openai/deployments/<deployment-name>

  • If you are using GPT 3.5 Turbo, for example, you will append it to the URL like so: https://<resource-name>.openai.azure.com/openai/deployments/gpt-35-turbo

Customize Your Base URL: Depending on your service instance or specific requirements, you may need to alter the base URL. For example:

  • If you have a custom deployment named custom-model, you would append it to the URL like so: https://<resource-name>.openai.azure.com/openai/deployments/custom-model

  • Ensure you replace <resource-name> and <deployment-name> with the actual information from your Azure OpenAI service.
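The assembly above can be sketched in a few lines. This is a minimal illustration, not Rewst's internal logic; `contoso-openai` and `gpt-35-turbo` are placeholder values, so substitute the resource name and deployment name from your own Azure OpenAI service.

```python
# Assemble the Azure OpenAI Base URL for Rewst from its two variable parts.
# Both values below are placeholders for illustration only.
resource_name = "contoso-openai"      # your Azure OpenAI resource name
deployment_name = "gpt-35-turbo"      # your model deployment name

base_url = f"https://{resource_name}.openai.azure.com/openai/deployments/{deployment_name}"
print(base_url)
# https://contoso-openai.openai.azure.com/openai/deployments/gpt-35-turbo
```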

You must complete the Azure API Version field; otherwise, the integration will fail.

The validation popup may show a failure. This can be ignored. OpenAI validation URLs are currently used and will be updated in a future release.
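To see why the API version field matters, here is a sketch of how the Base URL and the API version combine into the full endpoint that a chat completion request is sent to. The `2023-05-15` value is only an example api-version, and the Base URL is a placeholder; use the values from your own configuration.

```python
# Illustrative only: combine a (placeholder) Base URL with an example
# api-version to form the full Azure OpenAI chat completions endpoint.
base_url = "https://contoso-openai.openai.azure.com/openai/deployments/gpt-35-turbo"
api_version = "2023-05-15"  # example value; use the version you configured

request_url = f"{base_url}/chat/completions?api-version={api_version}"
print(request_url)
# https://contoso-openai.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-05-15
```

Without the `api-version` query parameter, Azure rejects the request, which is why leaving the field blank causes the integration to fail.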

Configuring Workflow Triggers to Use Azure OpenAI

If this is your second instance of OpenAI (i.e., you previously configured the direct OpenAI integration), you can adjust your workflow triggers to choose which instance of the integration to use in your workflows.

  1. Navigate to Integration Overrides.

  2. Click Add (the ➕ button).

  3. Choose OpenAI for Integration.

  4. Choose between your OpenAI integration instances to use in Integration Configuration.
