Providers

NanoGpt

AISDK provides first-class support for NanoGpt with fully typed model APIs. Model capabilities are enforced at compile time using Rust's type system, which prevents capability mismatches and guarantees that the selected model is valid for the task (e.g. tool calling).

Installation

Enable the NanoGpt provider feature:

cargo add aisdk --features nano-gpt

This installs AISDK with the NanoGpt provider enabled; once the feature is on, all aisdk features are available with this provider.
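Equivalently, you can enable the feature by hand in Cargo.toml (the version below is a placeholder; pin the release you actually depend on):

```toml
[dependencies]
aisdk = { version = "*", features = ["nano-gpt"] }
```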

Create a Provider Instance

To create a provider instance, call the constructor named after the NanoGpt model you want to use. Model names are exposed as snake-case methods, so "deepseek/deepseek-r1" becomes NanoGpt::deepseek_deepseek_r1().

use aisdk::providers::NanoGpt;

let nano_gpt = NanoGpt::deepseek_deepseek_r1();

This initializes the provider with:

  • Model: "deepseek/deepseek-r1"
  • API key from environment (if set with NANO_GPT_API_KEY)
  • NanoGpt's default base URL (https://nano-gpt.com/api/v1)

Basic Text Generation

Example using LanguageModelRequest for text generation.

use aisdk::{
    core::LanguageModelRequest,
    providers::NanoGpt,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {

    let nano_gpt = NanoGpt::deepseek_deepseek_r1();

    let response = LanguageModelRequest::builder()
        .model(nano_gpt)
        .prompt("Write a short poem about Rust.")
        .build()
        .generate_text()
        .await?;

    println!("Response text: {:?}", response.text());
    Ok(())
}

Provider Settings

You can customize the provider configuration using NanoGpt::builder().

API Key

let nano_gpt = NanoGpt::<DeepseekDeepseekR1>::builder()
    .api_key("your-api-key")
    .build()?;

If not specified, AISDK uses the NANO_GPT_API_KEY environment variable.
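For example, in a POSIX shell the key can be exported before starting your application (the value shown is a placeholder):

```shell
# Make the key available in the process environment
export NANO_GPT_API_KEY="your-api-key"
```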

Base URL

Useful when routing through a proxy, gateway, or self-hosted compatible endpoint.

let nano_gpt = NanoGpt::<DeepseekDeepseekR1>::builder()
    .base_url("https://nano-gpt.com/api/v1")
    .build()?;

Path (Full URL Override)

Use .path(...) to override the full request URL instead of only the base URL.

let nano_gpt = NanoGpt::<DeepseekDeepseekR1>::builder()
    .path("https://full-url.example/v1/chat/completions")
    .build()?;

Provider Name

For logging, analytics, and observability.

let nano_gpt = NanoGpt::<DeepseekDeepseekR1>::builder()
    .provider_name("NanoGpt")
    .build()?;

Full Custom Configuration Example

let nano_gpt = NanoGpt::<DeepseekDeepseekR1>::builder()
    .api_key("your-api-key")
    .base_url("https://nano-gpt.com/api/v1")
    .path("https://full-url.example/v1/chat/completions")
    .provider_name("NanoGpt")
    .build()?;

Dynamic Model Selection

For runtime model selection (e.g., loading models from config files), use DynamicModel:

Using the model_name() Method with Default Settings

use aisdk::providers::NanoGpt;

// Specify model as a string at runtime
let nano_gpt = NanoGpt::model_name("deepseek/deepseek-r1");

Using Builder Pattern with Custom Settings

use aisdk::{
    core::DynamicModel,
    providers::NanoGpt,
};

let nano_gpt = NanoGpt::<DynamicModel>::builder()
    .model_name("deepseek/deepseek-r1")
    .api_key("your-api-key")
    .base_url("https://nano-gpt.com/api/v1")
    .path("https://full-url.example/v1/chat/completions")
    .provider_name("NanoGpt")
    .build()?;

Warning: When using DynamicModel, model capabilities are not validated at compile time. This means there's no guarantee the model supports requested features (e.g., tool calls, structured output). For compile-time safety, use the typed methods like NanoGpt::deepseek_deepseek_r1().
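As a sketch of how a runtime model id might be resolved before being handed to NanoGpt::model_name(), the id can come from the environment with a fallback default (MODEL_ID is an illustrative variable name, not an aisdk convention):

```rust
use std::env;

// Resolve the model id at runtime, falling back to a default.
// MODEL_ID is an illustrative variable name, not part of aisdk.
fn resolve_model_id() -> String {
    env::var("MODEL_ID").unwrap_or_else(|_| "deepseek/deepseek-r1".to_string())
}

fn main() {
    let model_id = resolve_model_id();
    // The runtime string would then feed the dynamic constructor:
    // let nano_gpt = NanoGpt::model_name(&model_id);
    println!("{model_id}");
}
```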

Next Steps