LLM interfaces are the connection layer between the AI Fusion framework and Large Language Model (LLM) providers. This article explains how to configure and use LLM interfaces effectively.
The AI Fusion implementation invokes LLMs at multiple points in the agent workflow to execute specific tasks:
When an LLM task is triggered, Fabric looks up the configured LLM interface and uses it, with the credentials defined in that interface, to communicate with the underlying model.
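The lookup described above can be sketched as follows. This is a hypothetical illustration, not Fabric's actual API: the LLMInterface class, the REGISTRY mapping, and resolve_interface are invented names, and the tag/provider/credential fields are assumptions about what an interface configuration might hold.

```python
# Hypothetical sketch of how an LLM task might resolve its configured
# interface. All names here are illustrative, not Fabric's real API.
from dataclasses import dataclass

@dataclass
class LLMInterface:
    tag: str          # identifier referenced as llm://[tag]
    provider: str     # e.g. "openai" or "anthropic" (assumed field)
    model: str        # model name at the provider (assumed field)
    api_key: str      # credential stored with the interface (assumed field)

# Assumed in-memory registry of configured interfaces, keyed by tag.
REGISTRY = {
    "sql-generator": LLMInterface("sql-generator", "openai", "gpt-4o", "sk-..."),
}

def resolve_interface(uri: str) -> LLMInterface:
    """Map an llm://[tag] URI to the configured interface, or fail loudly."""
    prefix = "llm://"
    if not uri.startswith(prefix):
        raise ValueError(f"not an LLM interface URI: {uri}")
    tag = uri[len(prefix):]
    if tag not in REGISTRY:
        raise KeyError(f"no LLM interface configured for tag '{tag}'")
    return REGISTRY[tag]

iface = resolve_interface("llm://sql-generator")
print(iface.provider, iface.model)
```

The point of the sketch is the indirection: the task names only a tag, and the interface (model, provider, credentials) can be swapped without touching the flow.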
LLM interfaces are installed via K2exchange. Some foundation model providers function as infrastructure platforms and also host models that they do not own. The following are a few examples:
Note: As with any extension installed into your project, you should add its files to your project's Git repository.
You can create multiple LLM interfaces for different purposes:
To use a specific interface in your flow:
1. Assign the interface a tag (e.g., sql-generator).
2. Reference that tag in the interface parameter using the llm://[tag] syntax (e.g., llm://sql-generator).
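Tag-based selection can be sketched like this. The run_llm_task function and its interface parameter are hypothetical stand-ins for a flow task, invented for illustration; only the llm://[tag] addressing convention comes from the text above.

```python
# Hypothetical sketch: choosing an LLM interface per task via an
# "interface" parameter. run_llm_task is illustrative, not Fabric's API.
def run_llm_task(prompt: str, interface: str = "llm://default") -> str:
    """Pretend task runner: routes the prompt to the interface named by tag."""
    tag = interface.removeprefix("llm://")
    # A real deployment would call the provider configured under this tag;
    # here we just report the routing decision.
    return f"[{tag}] would handle: {prompt}"

# Different tasks can point at different interfaces:
print(run_llm_task("Generate SQL for top customers", interface="llm://sql-generator"))
print(run_llm_task("Summarize this ticket"))  # falls back to the default tag
```

This mirrors the pattern in the steps above: the flow passes a tag, and the framework decides which configured model and credentials stand behind it.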