Getting Started

Basic Usage

AISDK provides a simple and intuitive API for interacting with AI models.

You can easily perform tasks such as text generation and text streaming, as shown in the examples below.

You can explore all available core features here.

Basic Text Generation & Streaming

This section provides examples for generating and streaming text with aisdk using a model provider (e.g., OpenAI).

Installation

Install aisdk with the OpenAI provider feature, or enable any other provider you prefer.

cargo add aisdk --features openai

Once you have enabled the OpenAI provider, you can use all aisdk features with it.

Generate Text

The following example generates text using OpenAI as the provider. You can find more information about text generation here.

use aisdk::{
    core::LanguageModelRequest,
    providers::openai::OpenAI,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Build a request against the OpenAI provider, generate a completion,
    // and extract the generated text from the response.
    let text = LanguageModelRequest::builder()
        .model(OpenAI::new("gpt-5"))
        .prompt("Write a short poem about Rust.")
        .build()
        .generate_text()
        .await?
        .text()?;

    println!("Model output: {}", text);
    Ok(())
}
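
Because generate_text() is an async method, you can also drive several requests at once. Below is a minimal sketch assuming the same builder API shown above; the second prompt and the use of tokio::join! are illustrative additions, not part of the example above.

use aisdk::{
    core::LanguageModelRequest,
    providers::openai::OpenAI,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Build two independent requests; neither future is awaited yet.
    let poem = LanguageModelRequest::builder()
        .model(OpenAI::new("gpt-5"))
        .prompt("Write a short poem about Rust.")
        .build()
        .generate_text();

    let haiku = LanguageModelRequest::builder()
        .model(OpenAI::new("gpt-5"))
        .prompt("Write a haiku about Rust.") // illustrative second prompt
        .build()
        .generate_text();

    // tokio::join! drives both requests concurrently on the same task.
    let (poem, haiku) = tokio::join!(poem, haiku);
    println!("Poem: {}", poem?.text()?);
    println!("Haiku: {}", haiku?.text()?);
    Ok(())
}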

Streaming Text

The following example streams generated text using OpenAI as the provider. You can find more information about streaming here.

use aisdk::{
    core::{LanguageModelRequest, LanguageModelStreamChunkType},
    providers::openai::OpenAI,
};
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Build a streaming request and obtain the chunk stream.
    let mut stream = LanguageModelRequest::builder()
        .model(OpenAI::new("gpt-5"))
        .prompt("Write a short poem about Rust.")
        .build()
        .stream_text()
        .await?
        .stream;

    // Print each text chunk as soon as it arrives.
    while let Some(chunk) = stream.next().await {
        if let LanguageModelStreamChunkType::Text(text) = chunk {
            println!("Streaming text: {}", text);
        }
    }

    Ok(())
}
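
If you also want the complete response once streaming finishes, you can accumulate the chunks as they arrive. This is a minimal sketch that reuses only the stream and chunk type from the example above; collecting into a String is an illustrative choice.

use aisdk::{
    core::{LanguageModelRequest, LanguageModelStreamChunkType},
    providers::openai::OpenAI,
};
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut stream = LanguageModelRequest::builder()
        .model(OpenAI::new("gpt-5"))
        .prompt("Write a short poem about Rust.")
        .build()
        .stream_text()
        .await?
        .stream;

    // Accumulate text chunks as they arrive, then print the complete output.
    let mut full_text = String::new();
    while let Some(chunk) = stream.next().await {
        if let LanguageModelStreamChunkType::Text(text) = chunk {
            full_text.push_str(&text);
        }
    }
    println!("Complete output: {}", full_text);

    Ok(())
}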