How to use Azure AI Inference service with AI Connector
Using your own Azure AI Inference service and API.
Configuring the AI Connector with your own Azure AI Inference service can optimize your AI Process's performance and security. This guide provides detailed instructions on setting up the AI Connector, including necessary configurations for Model, Service URL, and authentication.
Prerequisites
Using this option requires you to have an active subscription to Azure AI services with the AI Model Inference API enabled.
You also need to have enabled API key authentication for the service.
You can learn more about the Azure AI Model Inference API here.
Configuring the AI Connector for Azure AI Inference
No preconfiguration is necessary in Frends to use the AI Connector with the Azure AI Inference service. Simply add the AI Connector shape to your Process and open the Configuration tab to enter the connection details. Ensure the service type is set to "Azure Inference" so that the correct parameters are shown.
Setting the AI Model and Service URL
Next, specify the AI Model name within Azure AI Inference, such as "gpt-4o", to identify which model the AI Connector will use. Then input the Service URL, pointing it to your Azure AI Inference endpoint, formatted as https://<your-endpoint>.inference.ai.azure.com/v1/chat/completions.
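As a quick illustration of the expected URL format, the sketch below builds the Service URL from an endpoint name. Note that "my-endpoint" is a hypothetical placeholder, not a real resource name:

```python
# Illustrative sketch: the Service URL format expected by the AI Connector.
# "my-endpoint" is a hypothetical resource name; substitute your own.
endpoint_name = "my-endpoint"
service_url = f"https://{endpoint_name}.inference.ai.azure.com/v1/chat/completions"
print(service_url)
# → https://my-endpoint.inference.ai.azure.com/v1/chat/completions
```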

Authentication
Authentication to Azure AI Inference API is done by using an API key. Obtain an API key for your Azure AI service and input it here.
As a general recommendation, authentication values and other values that rarely change, such as the service URL, should be saved as Environment Variables, as shown in the example, instead of being hard-coded.
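As a sketch of this recommendation, a client outside Frends would typically read the key from an environment variable rather than embedding it in code. The variable name AZURE_AI_API_KEY below is a hypothetical example, and the exact authentication header depends on your endpoint:

```python
import os

# Hypothetical environment variable name; in Frends you would reference
# an Environment Variable instead of a hard-coded key.
api_key = os.environ.get("AZURE_AI_API_KEY", "")

# The key is typically sent as a Bearer token or an "api-key" header,
# depending on the endpoint type; check your service's documentation.
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}
```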
Optional Tuning
For further optimization, you may adjust settings for Temperature and TopP to refine the AI's response variability and creativity.
Alternatively, or in addition, you can enable Use custom options to provide all the necessary configuration options as JSON.
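The option names accepted in the custom options JSON depend on the selected model; the fragment below is a hypothetical example using common chat-completion parameters:

```json
{
  "temperature": 0.7,
  "top_p": 0.95,
  "max_tokens": 1024
}
```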
Finally, you can configure logging and promote result settings to align with your operational needs.
Using the AI Connector with Azure AI Inference
Once the service configuration is done and the tuning options are set, you can start using the Connector by providing the user and system prompts as needed. Instead of using the Frends AI services, the shape will now connect to your Azure AI service to handle the AI queries.
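Conceptually, the shape combines the system and user prompts into a chat-completions request body. The sketch below illustrates what such a body might look like; the field names follow the common chat-completions convention, and the model name and prompt texts are hypothetical examples (the AI Connector builds this for you from the shape's configuration):

```python
import json

# Hypothetical request body; the AI Connector assembles the equivalent
# from the prompts entered in the shape's configuration.
body = {
    "model": "gpt-4o",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the attached order data."},
    ],
}
payload = json.dumps(body)
```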