Tools
While large language models (LLMs) have incredible generation capabilities, they struggle with discrete tasks (e.g. mathematics) and with interacting with the outside world (e.g. getting the weather).
A 'Tool' is a general term for a function or piece of logic that an AI model can execute. Tools can be user-defined or pre-defined by the provider, and they are added to the AI model's context as the developer's needs require. The model can then use the provided tools to answer specific queries during text generation or to take actions externally.
Introduction & Capability Safety
Not all models support tool calls. AISDK's Capability System ensures you only provide tools to models that explicitly support them. The check happens at compile time, preventing you from shipping broken agent logic.
Here is how the type system protects you:
```rust
// ✅ THIS WORKS: GPT-5 supports tool calls
let request = LanguageModelRequest::builder()
    .model(OpenAI::gpt_5())
    .with_tool(my_tool()) // Valid!
    .build();

// ❌ THIS FAILS TO COMPILE: O1 Mini doesn't support tool calls
let request = LanguageModelRequest::builder()
    .model(OpenAI::o1_mini())
    .with_tool(my_tool()) // ERROR: the trait `ToolCallSupport` is not implemented
    .build();
```

Defining Tools with #[tool]
The recommended way to define tools is using the #[tool] macro. This attribute transforms a standard Rust function into a Tool object that the model can understand.
```rust
use aisdk::core::Tool;
use aisdk::macros::tool;

#[tool]
/// Get the current weather in a specific location
pub fn get_weather(location: String) -> Tool {
    // In a real app, this would call a weather API
    let weather = format!("The weather in {} is 75°F and sunny.", location);
    Ok(weather)
}
```

Function-to-Tool Extraction
AISDK automatically extracts metadata from your function:
- Name: A unique, descriptive name for the tool. Here it is automatically inferred as `get_weather` from the function name.
- Description: Briefly describes the tool's purpose and usage. Rust-style doc comments with examples can be used for few-shot prompting (see the sketch after this list).
- Arguments: Function parameters are automatically converted into a JSON Schema using `schemars`.
- Body: The tool's logic, executed when the AI calls it. It must return `Tool` or `Result<String, String>`. Errors can be returned as `Err(String)` to help the AI understand failures.
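To illustrate the Description and Body points above, here is a minimal sketch of a slightly richer tool: it takes several typed arguments, uses its doc comment for a few-shot example, and returns `Err(String)` when it cannot answer. The `convert_temperature` tool itself is hypothetical and assumes the `#[tool]` macro accepts multiple typed parameters, following the same conventions as the `get_weather` example.

```rust
use aisdk::core::Tool;
use aisdk::macros::tool;

#[tool]
/// Convert a temperature between Celsius and Fahrenheit.
///
/// Example: convert_temperature(100.0, "C", "F") -> "212°F"
pub fn convert_temperature(value: f64, from: String, to: String) -> Tool {
    // The doc comment above becomes the description (including the few-shot
    // example), and `value`, `from`, and `to` are converted to a JSON Schema.
    match (from.as_str(), to.as_str()) {
        ("C", "F") => Ok(format!("{}°F", value * 9.0 / 5.0 + 32.0)),
        ("F", "C") => Ok(format!("{}°C", (value - 32.0) * 5.0 / 9.0)),
        // Returning Err(String) tells the model why the call failed.
        (f, t) => Err(format!("Unsupported conversion: {} -> {}", f, t)),
    }
}
```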
Registering Tools
To make a tool available to a model, pass it to the .with_tool() method in the LanguageModelRequest builder.
The #[tool] macro wraps your logic in a function that returns a `Tool`, so you must call the function (e.g., `get_weather()`) when passing it to the builder.
This appends the tool to the list of tools that the AI model will use.
```rust
let response = LanguageModelRequest::builder()
    .model(OpenAI::gpt_5())
    .prompt("What's the weather in Tokyo?")
    .with_tool(get_weather()) // Register the tool
    .build()
    .generate_text()
    .await?;
```
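Because each `.with_tool()` call appends to the request's tool list, you can chain several calls to expose multiple tools on one request. A minimal sketch, assuming a second hypothetical `get_time()` tool defined with `#[tool]` in the same way:

```rust
// Register several tools; the model decides which (if any) to call.
let response = LanguageModelRequest::builder()
    .model(OpenAI::gpt_5())
    .prompt("What's the weather and the local time in Tokyo?")
    .with_tool(get_weather()) // first tool
    .with_tool(get_time())    // second, hypothetical tool
    .build()
    .generate_text()
    .await?;
```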
tool Macro Customization
You can override the default inferred name and description using macro attributes.
```rust
#[tool(name = "check-weather", desc = "Retrieves weather data for a city")]
fn get_weather(location: String) -> Tool {
    Ok(format!("Sunny in {}", location))
}
```

Manual Tool Definition (Advanced)
You can also define your own tools in aisdk by instantiating the Tool and related structs manually and passing them to one of the AI model text generation builders.
Defining the Tool
```rust
use aisdk::core::{Tool, ToolExecute};
use serde_json::Value;

// Define the tool's function body; it should return Result<String, String>.
let func = ToolExecute::new(Box::new(|inp: Value| {
    // AISDK will pass in a JSON object with the following structure:
    // ```json
    // {
    //     "location": "New York"
    // }
    // ```
    let location = match inp.get("location").and_then(|v| v.as_str()) {
        Some(location) => location.to_string(),
        None => return Err("Missing `location` argument".to_string()),
    };
    Ok(format!("The weather in {} is cloudy.", location))
}));

// Define the tool's input structure.
#[derive(schemars::JsonSchema, Debug)]
#[allow(dead_code)]
struct ToolInput {
    location: String,
}

// Convert the tool's arguments into a JSON Schema,
// which will look similar to the following:
// ```json
// "properties": {
//     "location": {
//         "type": "string"
//     }
// }
// ```
let schema = schemars::schema_for!(ToolInput);

// Bring it all together.
let get_weather_tool = Tool::builder()
    .name("get-weather")
    .description("Get the weather information given a location")
    .input_schema(schema.clone())
    .execute(func)
    .build()
    .unwrap();
```

Registering Manually-Defined Tools
To register manually-defined tools with the AI model, add them to the LanguageModelRequest builder using the with_tool method.
This appends the tool to the list of tools that the AI model will use.
```rust
// Call the model with the manually built `Tool` struct
let result = LanguageModelRequest::builder()
    .model(OpenAI::new("gpt-4o"))
    .system("You are a helpful assistant with access to tools.")
    .prompt("What is the weather in New York?")
    .with_tool(get_weather_tool) // Pass the `Tool` value directly; no function call needed here.
    .build()
    .generate_text()
    .await;
```

Next Steps
- Use tools in a loop with Agents.
- Learn how to extract Structured Output.
- Explore the full Language Model Request API.