How to use on-premise Ollama with AI Connector
Using your own Ollama installation to perform AI operations.
Configuring the AI Connector to use your own Ollama installation can improve the performance and security of your AI Processes. This guide provides detailed instructions on setting up the AI Connector, including the necessary configurations for the Model, Service URL, and authentication.
Prerequisites
Before Ollama can be used to perform AI operations, you need to set the service up on a server and configure it so that it is accessible to the Frends Agents you have in use.
When using on-premise Frends Agents with an on-premise Ollama service, you only need to make sure your internal network setup allows connectivity between the local machines.
By default, the Ollama service listens on port 11434, which must be open in the firewall for the service to be reachable.
If you plan to use the on-premise Ollama service from Frends Cloud Agents, the service must either be accessible from the public internet, or the server it runs on must allow incoming connections from the Frends Cloud. Note that by default, Ollama does not support any authentication methods, so using it over the internet poses a serious security risk.
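Before configuring anything in Frends, you can verify that the service is reachable from the network your Agent runs in. A running Ollama instance answers its root path with the text "Ollama is running"; a minimal check in Python, where ollama-host is a placeholder for your own server name, might look like this:

```python
# Minimal reachability check for an on-premise Ollama service.
# "ollama-host" is a placeholder; substitute your own server name.
import urllib.request

url = "http://ollama-host:11434/"
try:
    with urllib.request.urlopen(url, timeout=5) as resp:
        # A running instance responds with "Ollama is running".
        print(resp.status, resp.read().decode())
except OSError as err:
    print(f"Ollama is not reachable at {url}: {err}")
```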
Configuring the AI Connector for Ollama
No preconfiguration is necessary in Frends to use the AI Connector with the Ollama service. Simply add the AI Connector shape to your Process and open the Configuration tab to enter the connection details. Ensure the service type is set to "Ollama" to enable the correct parameters.
Setting the AI Model and Service URL
Next, specify the AI Model name as it appears in Ollama, such as "llama3:8b", to identify which model the Connector will use. Then enter the Service URL, pointing it to your designated Ollama endpoint, formatted as http://host-server-name:port/api/generate.
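To confirm that the model name and endpoint are correct before wiring them into the Connector, you can send the same kind of request directly to the service. A minimal sketch in Python (ollama-host is a placeholder, and the model must already have been pulled on the server, e.g. with ollama pull llama3:8b):

```python
# Sketch of a raw call to the Ollama generate endpoint.
# "ollama-host" is a placeholder; the model must already be pulled.
import json
import urllib.request

payload = {
    "model": "llama3:8b",
    "prompt": "Summarize what an integration platform does.",
    "stream": False,  # return one JSON object instead of a stream
}
req = urllib.request.Request(
    "http://ollama-host:11434/api/generate",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req, timeout=120) as resp:
    print(json.loads(resp.read())["response"])
```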

Authentication and Security
The Ollama service does not support authentication or other security features by itself, as it targets local-only usage. Securing it with a reverse proxy or a similar solution is therefore necessary if it is to be used over the internet.
For example, nginx can be used for this, allowing only HTTPS connections to the Ollama service, and only when the Service URL contains the correct API key as a query parameter.
Because the AI Connector does not support additional headers or fields for Ollama, authentication methods that rely on the Service URL are the only available option at this time. As such, the setup may not be secure enough to use over the public internet, even with an API key passed as a query parameter.
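As an illustration only, a client request through such a proxy could look like the sketch below. The proxy host name, the api_key parameter name, and the key value are all hypothetical and depend entirely on how you configure your own proxy:

```python
# Hypothetical call through a TLS-terminating reverse proxy that gates
# access on an api_key query parameter. Host name, parameter name, and
# key are placeholders; they depend on your own proxy configuration.
import json
import urllib.parse
import urllib.request

base = "https://ollama-proxy.example.com/api/generate"
query = urllib.parse.urlencode({"api_key": "replace-with-your-key"})
payload = {"model": "llama3:8b", "prompt": "Hello", "stream": False}

req = urllib.request.Request(
    f"{base}?{query}",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req, timeout=120) as resp:
    print(json.loads(resp.read())["response"])
```

In the AI Connector, the equivalent would be entering the full URL, including the query parameter, as the Service URL.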
Optional Tuning
For further optimization, you can adjust the Temperature and TopP settings to refine the variability and creativity of the AI's responses. Additionally, configure the logging and promote result settings to align with your operational needs.
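These settings correspond to the temperature and top_p sampling options in Ollama's own API (the exact mapping inside the Connector is an assumption here). Reusing the request code from the earlier sketch, the options would be supplied in the payload like this:

```python
# Same request shape as the earlier sketch, extended with sampling options.
# That the Connector's Temperature/TopP map to these fields is an assumption;
# in Ollama's own API they are passed under "options".
payload = {
    "model": "llama3:8b",
    "prompt": "Suggest three names for an internal chatbot.",
    "stream": False,
    "options": {
        "temperature": 0.8,  # higher values -> more varied, creative output
        "top_p": 0.9,        # nucleus sampling cutoff; lower -> more focused
    },
}
```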
Using the AI Connector with Ollama
Once the service configuration is done and the tuning options are set, you can start using the Connector by providing the user and system prompts as necessary. Instead of connecting to the Frends AI services, the shape will now connect to your Ollama service to handle the AI queries.