Generated with sparks and insights from 6 sources
Introduction
- Ollama supports tool calling with popular models such as Llama 3.1, Mistral Nemo, Firefunction v2, and Command R+.
- Tool calling allows models to perform complex tasks or interact with the outside world by using functions, APIs, web browsing, and code interpreters.
- To enable tool calling, users provide a list of available tools via the `tools` field in Ollama's API.
- Supported models will answer with a `tool_calls` response, and tool responses can be provided via messages with the `tool` role.
- Ollama's OpenAI-compatible endpoint also supports tools, making it possible to switch existing OpenAI-based applications to Llama 3.1 and other models.
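To illustrate, here is a minimal sketch of a chat request body carrying the `tools` field, following the shape of Ollama's chat API; the `get_current_weather` tool and its schema are hypothetical examples, and actually sending the request assumes a local Ollama server:

```python
import json

# Sketch of an Ollama /api/chat request body with the `tools` field.
# The weather tool below is a made-up example, not part of Ollama.
payload = {
    "model": "llama3.1",
    "messages": [
        {"role": "user", "content": "What is the weather in Toronto?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {
                            "type": "string",
                            "description": "The name of the city",
                        }
                    },
                    "required": ["city"],
                },
            },
        }
    ],
}

# To actually send it (requires Ollama running locally), something like:
# requests.post("http://localhost:11434/api/chat", json=payload)
print(json.dumps(payload, indent=2))
```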
Supported Models [1]
- Llama 3.1: A popular model supported by Ollama for tool calling.
- Mistral Nemo: Another model that can use tool calling in Ollama.
- Firefunction v2: Supported for tool calling in Ollama.
- Command R+: Included in the list of models that support tool calling.
- Check for latest models: Users should ensure they have the latest model by running `ollama pull <model>`.
Tool Calling Mechanism [1]
- Tool calling is enabled by providing a list of available tools via the `tools` field in Ollama's API.
- Example tools include functions, APIs, web browsing, and code interpreters.
- Supported models will answer with a `tool_calls` response.
- Tool responses can be provided via messages with the `tool` role.
- Detailed documentation is available in Ollama's API documentation.
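The mechanism above can be sketched end to end: the model's reply carries `tool_calls`, the application runs the named function, and the result goes back as a message with the `tool` role. The response dict below is simulated rather than fetched from a model, and `get_current_weather` is a hypothetical local function:

```python
# Simulated round trip for Ollama-style tool calling.

def get_current_weather(city: str) -> str:
    # Dummy implementation standing in for a real weather lookup.
    return f"22 degrees and sunny in {city}"

available = {"get_current_weather": get_current_weather}

# What a supported model might return (shape per Ollama's chat API):
response = {
    "message": {
        "role": "assistant",
        "tool_calls": [
            {"function": {"name": "get_current_weather",
                          "arguments": {"city": "Toronto"}}}
        ],
    }
}

messages = [{"role": "user", "content": "What is the weather in Toronto?"}]
messages.append(response["message"])

for call in response["message"].get("tool_calls", []):
    fn = available[call["function"]["name"]]
    result = fn(**call["function"]["arguments"])
    # Feed the tool output back to the model with the `tool` role.
    messages.append({"role": "tool", "content": result})

print(messages[-1])
```

In a real application, `messages` would then be sent back to the chat endpoint so the model can compose its final answer from the tool output.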
Integration with Spring AI [2]
- Spring AI has integrated Ollama's function calling capabilities into the Spring ecosystem.
- Java developers can easily leverage this functionality in their applications.
- Key features include easy integration, flexible configuration, and automatic JSON schema generation.
- Multiple functions can be registered in a single chat session, with function selection at runtime.
- Ollama is OpenAI API compatible, allowing it to be used with the Spring AI OpenAI client.
Advanced Tool Management [3]
- Centralized Tool Registry: Define and export all tool configurations in a single module.
- Decorator-based Tool Registration: Use decorators to register tools automatically.
- Configuration-driven Approach: Define tools in a configuration file (e.g., YAML).
- Combining Approaches: Use a `ToolManager` class to combine the different techniques.
- Benefits: Modularity, scalability, maintainability, flexibility, and readability.
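A minimal sketch of the decorator-based registration pattern, assuming a `ToolManager` class like the one mentioned above; the names and the simplified schema derivation (every parameter typed as a string) are illustrative, not an Ollama API:

```python
import inspect

class ToolManager:
    """Central registry that collects tools via a decorator."""

    def __init__(self):
        self._tools = {}

    def tool(self, func):
        # Decorator: register the function and derive a JSON schema
        # from its signature. All parameters are typed as strings
        # here for brevity; a real registry would inspect annotations.
        params = inspect.signature(func).parameters
        self._tools[func.__name__] = {
            "type": "function",
            "function": {
                "name": func.__name__,
                "description": (func.__doc__ or "").strip(),
                "parameters": {
                    "type": "object",
                    "properties": {name: {"type": "string"} for name in params},
                    "required": list(params),
                },
            },
        }
        return func

    def schemas(self):
        # Tool definitions in the shape of the API `tools` field.
        return list(self._tools.values())

manager = ToolManager()

@manager.tool
def get_current_weather(city):
    """Get the current weather for a city."""
    return f"sunny in {city}"

print(manager.schemas())
```

Because registration happens at import time, any module that defines decorated tools contributes to the registry automatically, which is where the modularity and scalability benefits come from.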
Local Model Tool Calling [4]
- Ollama supports tool calls with local models starting from version 0.3.0.
- Example: Building a weather report agent using AutoGen.Net and Ollama.
- Create a .NET console application and add the necessary AutoGen.Net packages.
- Define a dummy tool class that fetches weather reports.
- Connect to Ollama using OpenAIChatAgent and a CustomHttpClientHandler.
Ad-Hoc Tool Calling [5]
- Ad-hoc tool calling can be added to a chat model using a prompting strategy.
- Install the necessary packages, such as langchain and langchain-community.
- Create custom tools and define their arguments and descriptions.
- Write a prompt that specifies the tools, their arguments, and the desired output format.
- Use JsonOutputParser to parse the model's output into JSON.
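The strategy can be sketched without the langchain dependency: the prompt describes the tools and demands JSON, and the model's raw text is parsed into a dict (the role langchain's JsonOutputParser plays). The `multiply` tool and the simulated model reply are illustrative:

```python
import json

def multiply(x: float, y: float) -> float:
    """Multiply two numbers."""
    return x * y

tools = {"multiply": multiply}

# Build a prompt that lists each tool with its description and pins
# down the output format the model must follow.
tool_descriptions = "\n".join(
    f"- {name}: {fn.__doc__}" for name, fn in tools.items()
)
prompt = (
    "You can use these tools:\n"
    f"{tool_descriptions}\n"
    'Reply ONLY with JSON: {"name": <tool name>, "arguments": {...}}'
)

# A plausible raw model reply, simulated here instead of calling a model:
raw_output = '{"name": "multiply", "arguments": {"x": 3, "y": 4}}'

call = json.loads(raw_output)  # the JsonOutputParser step
result = tools[call["name"]](**call["arguments"])
print(result)  # prints 12
```

Since this relies only on the model following the format instruction, production code should wrap the parsing step in error handling for malformed JSON.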
Related Videos
- [Function Calling with Local Models & LangChain - Ollama ...](https://www.youtube.com/watch?v=Ss_GdU0KqE0) (May 8, 2024)
- [Function Calling in Ollama vs OpenAI](https://www.youtube.com/watch?v=RXDWkiuXtG0) (Feb 13, 2024)
- [Ollama Function Calling Advanced: Make your Application ...](https://www.youtube.com/watch?v=eHfMCtlsb1o) (Feb 13, 2024)