# Intelligent AI Connector

AI is useful when you use it as a tool, but it becomes transformative when you integrate it into your workflows. By making AI an active participant in your processes, you can stop managing the task and let the system handle it entirely. In Frends, AI can be included in Processes natively using the Intelligent AI Connector, automating tasks that have so far required human input.

## Intelligent AI Connector

The Intelligent AI Connector is a new building block for Processes in Frends. It allows the AI to participate in your Processes by performing actions you have specified as prompts, at exactly the point in the Process where you need them.

<figure><img src="/files/j5QCMgRt1zkDOQ12JNHI" alt="Picture showing the Native AI shape as part of a Frends Process."><figcaption><p>Intelligent AI Connector in action.</p></figcaption></figure>

Multiple prompts can be provided for a single shape, ensuring that the answer follows the required format and is repeatable. By default, a Frends system prompt is given to the shape so that the AI works with the Frends ideology and Processes specifically, instead of answering in generic text or another unsuitable format. From there, it is up to the developer to specify the action taken by the AI.

## Why Use AI Connector

BPMN 2.0 combined with the AI Connector enables AI orchestrations. In practice, you can implement the thought process of an employee as steps and decisions. Instead of requiring a human decision to categorize an incoming message as a tech or sales support case, the AI Connector with an LLM model can read and categorize the message for you automatically. The Frends Process can then route the case to the correct destination. The whole flow for multiple different ticket cases can be written as a series of prompts, decisions, and actions that combine the basic operations from Tasks and programming with input from the AI.
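To illustrate the routing step described above, here is a minimal sketch in Python. Everything in it is hypothetical: `route_message`, `ROUTES`, and the queue names are illustrative stand-ins, not part of the Frends API; in a real Process the branching would be done with gateways after the AI Connector shape returns a category.

```python
# Hypothetical sketch: route a support message based on the category
# label an AI step has assigned. Names and queues are illustrative only.

ROUTES = {
    "tech": "tech-support-queue",
    "sales": "sales-support-queue",
}

def route_message(category: str) -> str:
    """Map the AI-assigned category to a destination queue.

    Unknown categories fall back to a human-review queue, mirroring how
    a Process branch might handle an unexpected AI answer.
    """
    return ROUTES.get(category.strip().lower(), "human-review-queue")
```

The fallback branch matters: because an LLM answer is not guaranteed to match the expected format, a default path keeps the Process from failing on an unanticipated category.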

## Choose your AI Model

The AI Connector in Frends is not tied to a single model. Frends provides its own AI services with a curated set of LLM models for various use cases that work out of the box with no configuration. Alternatively, you can provide your own Azure Inference service, or take ownership of your data a step further and use an on-premise Ollama installation to perform the AI actions.
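For context on the on-premise option, here is a minimal sketch of calling a local Ollama instance directly over its REST API (`POST /api/generate`). The model name and prompt are placeholders, the default host assumes a standard local Ollama install, and this is independent of how the Frends connector itself is configured.

```python
import json
from urllib import request

def build_ollama_request(model: str, prompt: str,
                         host: str = "http://localhost:11434"):
    """Build the URL and JSON body for Ollama's /api/generate endpoint."""
    url = f"{host}/api/generate"
    # stream=False asks Ollama to return one complete JSON response.
    body = {"model": model, "prompt": prompt, "stream": False}
    return url, json.dumps(body).encode("utf-8")

url, body = build_ollama_request(
    "llama3", "Categorize this ticket as tech or sales: printer offline")

# Performing the call requires a running Ollama instance:
# req = request.Request(url, data=body,
#                       headers={"Content-Type": "application/json"})
# with request.urlopen(req) as resp:
#     answer = json.loads(resp.read())["response"]
```

Keeping inference on a machine you control is what the on-premise path buys you: the prompt and the data it contains never leave your network.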

## The Future

Future Frends features will expand the AI Connector to make API calls internally inside the AI step, and even enable Computer Use access to complete the Agentic AI path. These capabilities will arrive later on our roadmap. With the Intelligent AI Connector, you can already start your Agentic AI journey.

## Reference Documentation

To learn the specifics of using the AI Connector in your Processes, head over to the [reference documentation for AI Connector](/reference/shapes/activity-shapes/ai-connector.md), which covers the shape in detail.


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.frends.com/frends-development/ai-features/intelligent-ai-connector.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
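The query URL shown above can be built programmatically. This sketch uses only Python's standard library to URL-encode the question; the question text is a placeholder.

```python
from urllib.parse import urlencode

BASE = ("https://docs.frends.com/frends-development/"
        "ai-features/intelligent-ai-connector.md")

def ask_url(question: str) -> str:
    """Build the documentation-query URL, URL-encoding the question."""
    return f"{BASE}?{urlencode({'ask': question})}"

# The GET itself can then be issued with any HTTP client, e.g.:
# from urllib.request import urlopen
# answer = urlopen(ask_url("Which models does the AI Connector support?")).read()
```

URL-encoding matters here because natural-language questions routinely contain spaces and punctuation that are not valid raw query-string characters.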
