Submodel
AISDK provides first-class support for Submodel with fully typed model APIs. Model capabilities are enforced at compile time using Rust's type system. This prevents model capability mismatches and guarantees the selected model is valid for the task (e.g. tool calling).
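Conceptually, this kind of compile-time enforcement works like ordinary Rust trait bounds: a capability is a trait, and only model types that implement it can be passed to functions that need it. A purely illustrative sketch (these are generic marker types, not aisdk's actual types):

```rust
// Illustrative marker trait for a model capability (not aisdk's real API).
trait SupportsToolCalling {}

struct ToolCapableModel;
impl SupportsToolCalling for ToolCapableModel {}

#[allow(dead_code)]
struct TextOnlyModel;

// Accepts only models whose type implements the capability.
fn run_with_tools<M: SupportsToolCalling>(_model: &M) -> &'static str {
    "ok"
}

fn main() {
    let model = ToolCapableModel;
    println!("{}", run_with_tools(&model)); // prints "ok"
    // run_with_tools(&TextOnlyModel); // would not compile: missing capability
}
```

Passing a model without the required capability is rejected by the compiler rather than failing at runtime.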
Installation
Enable the Submodel provider feature:
```sh
cargo add aisdk --features submodel
```

This installs AISDK with the Submodel provider enabled. Once the Submodel provider is enabled, you can use all AISDK features with it.
Create a Provider Instance
To create a provider instance, call Submodel::model_name(), where model_name is the Submodel model you want to use.
Model names are exposed as snake-case methods.
```rust
use aisdk::providers::Submodel;

let submodel = Submodel::qwen_qwen3_235b_a22b_instruct_2507();
```

This initializes the provider with:
- Model: "Qwen/Qwen3-235B-A22B-Instruct-2507"
- API key from the environment (if set with SUBMODEL_INSTAGEN_ACCESS_KEY)
- Submodel's default base URL (https://llm.submodel.ai/v1)
Basic Text Generation
Example using LanguageModelRequest for text generation.
```rust
use aisdk::{
    core::LanguageModelRequest,
    providers::Submodel,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let submodel = Submodel::qwen_qwen3_235b_a22b_instruct_2507();

    let response = LanguageModelRequest::builder()
        .model(submodel)
        .prompt("Write a short poem about Rust.")
        .build()
        .generate_text()
        .await?;

    println!("Response text: {:?}", response.text());

    Ok(())
}
```

Provider Settings
You can customize the provider configuration using Submodel::builder().
API Key
```rust
let submodel = Submodel::<QwenQwen3235bA22bInstruct2507>::builder()
    .api_key("your-api-key")
    .build()?;
```

If not specified, AISDK uses the SUBMODEL_INSTAGEN_ACCESS_KEY environment variable.
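In deployment, the key is typically supplied via the environment rather than hard-coded. For example, in a shell session (the value below is a placeholder):

```shell
# Placeholder value; use your real Submodel access key.
export SUBMODEL_INSTAGEN_ACCESS_KEY="your-api-key"
```

Any process started from that shell, including your compiled binary, will then pick up the key automatically.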
Base URL
Useful when routing through a proxy, gateway, or self-hosted compatible endpoint.
```rust
let submodel = Submodel::<QwenQwen3235bA22bInstruct2507>::builder()
    .base_url("https://llm.submodel.ai/v1")
    .build()?;
```

Path (Full URL Override)
Use .path(...) to override the full request URL instead of only the base URL.
```rust
let submodel = Submodel::<QwenQwen3235bA22bInstruct2507>::builder()
    .path("https://full-url.example/v1/chat/completions")
    .build()?;
```

Provider Name
For logging, analytics, and observability.
```rust
let submodel = Submodel::<QwenQwen3235bA22bInstruct2507>::builder()
    .provider_name("Submodel")
    .build()?;
```

Full Custom Configuration Example
```rust
let submodel = Submodel::<QwenQwen3235bA22bInstruct2507>::builder()
    .api_key("your-api-key")
    .base_url("https://llm.submodel.ai/v1")
    .path("https://full-url.example/v1/chat/completions")
    .provider_name("Submodel")
    .build()?;
```

Dynamic Model Selection
For runtime model selection (e.g., loading models from config files), use DynamicModel:
Using model_name() Method with Default Settings
```rust
use aisdk::providers::Submodel;

// Specify the model as a string at runtime
let submodel = Submodel::model_name("Qwen/Qwen3-235B-A22B-Instruct-2507");
```

Using Builder Pattern with Custom Settings
```rust
use aisdk::{
    core::DynamicModel,
    providers::Submodel,
};

let submodel = Submodel::<DynamicModel>::builder()
    .model_name("Qwen/Qwen3-235B-A22B-Instruct-2507")
    .api_key("your-api-key")
    .base_url("https://llm.submodel.ai/v1")
    .path("https://full-url.example/v1/chat/completions")
    .provider_name("Submodel")
    .build()?;
```

Warning: When using DynamicModel, model capabilities are not validated at compile time.
This means there's no guarantee the model supports requested features (e.g., tool calls, structured output).
For compile-time safety, use the typed methods like Submodel::qwen_qwen3_235b_a22b_instruct_2507().
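Since dynamic selection resolves the model from a plain string, a common pattern is loading that string at runtime, for example from an environment variable. A minimal stdlib-only sketch (the environment variable name AISDK_SUBMODEL_MODEL is hypothetical, and the aisdk call is shown as a comment):

```rust
use std::env;

// Resolve the model string at runtime, falling back to a known
// Submodel model when the (hypothetical) env var is unset.
fn resolve_model_name() -> String {
    env::var("AISDK_SUBMODEL_MODEL")
        .unwrap_or_else(|_| "Qwen/Qwen3-235B-A22B-Instruct-2507".to_string())
}

fn main() {
    let model_name = resolve_model_name();
    // The resolved string is then passed to the provider, e.g.:
    // let submodel = Submodel::model_name(&model_name);
    println!("{model_name}");
}
```

This keeps model choice in deployment configuration while the rest of the request-building code stays unchanged.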
Next Steps
- Take a deeper look at text generation features: Generating Text / Streaming Text
- Explore Structured Output for reliable agent data.
- Learn how to create Custom Tools.
- Learn more about Agents.