Providers
OpenAI
AISDK includes first-class support for the OpenAI API.
Installation
Enable the OpenAI provider feature:
cargo add aisdk --features openai
This installs AISDK with the OpenAI provider enabled.
Once you have enabled the OpenAI provider, you can use all aisdk features with it.
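If you prefer editing Cargo.toml directly, the equivalent dependency entry looks roughly like the snippet below (the version number is a placeholder, not a pinned release):
[dependencies]
aisdk = { version = "0.1", features = ["openai"] }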
Quick Start
Create an OpenAI provider instance with default settings:
use aisdk::providers::openai::OpenAI;
let openai = OpenAI::new("gpt-5");
This initializes the provider with:
- Model: "gpt-5"
- API key from the environment (if set with OPENAI_API_KEY; see the example below)
- OpenAI's default base URL
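The simplest way to supply the key is to export it in your shell before running your program (the value below is a placeholder):
export OPENAI_API_KEY="your-api-key"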
Basic Text Generation
use aisdk::{
    core::LanguageModelRequest,
    providers::openai::OpenAI,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Initialize provider with default settings.
    let openai = OpenAI::new("gpt-5");
    let response = LanguageModelRequest::builder()
        .model(openai)
        .prompt("Write a short poem about Rust.")
        .build()
        .generate_text()
        .await?
        .text()?;
    println!("Model output: {}", response);
    Ok(())
}
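The example above uses Tokio as its async runtime, so your project also needs the tokio crate. One way to add it (the feature flags shown are typical for #[tokio::main]; adjust to your needs):
cargo add tokio --features macros,rt-multi-thread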
Provider Settings
You can customize the provider configuration using OpenAI::builder().
API Key
let openai = OpenAI::builder()
    .api_key("your-api-key")
    .build()?;
If not specified, AISDK uses the OPENAI_API_KEY environment variable.
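If your key lives under a differently named environment variable, you can read it yourself and pass it to the builder. A minimal sketch, assuming a hypothetical variable name MY_OPENAI_KEY:
use aisdk::providers::openai::OpenAI;

// Read the key from a custom environment variable (name is illustrative).
let api_key = std::env::var("MY_OPENAI_KEY")?;
let openai = OpenAI::builder()
    .api_key(api_key.as_str())
    .build()?;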
Base URL
Useful when routing through a proxy, gateway, or self-hosted compatible endpoint.
let openai = OpenAI::builder()
    .base_url("https://api.openai.com/v1")
    .build()?;
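For example, a self-hosted OpenAI-compatible gateway might be configured like this (the address is purely illustrative):
let openai = OpenAI::builder()
    .base_url("http://localhost:8080/v1")
    .build()?;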
Provider Name
Useful for logging, analytics, and observability.
let openai = OpenAI::builder()
    .provider_name("OpenAI")
    .build()?;
Model Name
Set a default model for all requests using this provider instance:
let openai = OpenAI::builder()
    .model_name("gpt-4o")
    .build()?;
Full Custom Configuration Example
let openai = OpenAI::builder()
    .api_key("your-api-key")
    .base_url("https://api.openai.com/v1")
    .provider_name("OpenAI")
    .model_name("gpt-4o")
    .build()?;
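A provider configured this way is used exactly like the one from the Quick Start. The sketch below simply reuses the request API shown above, with the builder-configured model name serving as the default for the request:
use aisdk::{
    core::LanguageModelRequest,
    providers::openai::OpenAI,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Customized provider: explicit key plus a default model for all requests.
    let openai = OpenAI::builder()
        .api_key("your-api-key")
        .model_name("gpt-4o")
        .build()?;

    // Same request flow as the Quick Start example.
    let response = LanguageModelRequest::builder()
        .model(openai)
        .prompt("Write a short poem about Rust.")
        .build()
        .generate_text()
        .await?
        .text()?;
    println!("Model output: {}", response);
    Ok(())
}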