Groq
AISDK provides first-class support for Groq with fully typed model APIs. Model capabilities are enforced at compile time using Rust's type system, which prevents capability mismatches and guarantees that the selected model supports the features a request needs (e.g. tool calling).
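To illustrate what this means in practice, the sketch below shows the general marker-trait pattern behind such compile-time checks. The trait and type names are invented for this example and are not aisdk's actual API:

```rust
// Illustrative only: these names are NOT aisdk's real API. The sketch shows the
// general marker-trait pattern that turns a missing model capability into a
// compile error instead of a runtime failure.

/// Marker trait for models that support tool calling.
trait SupportsToolCalling {}

struct LlamaWithTools; // hypothetical model that supports tool calling
#[allow(dead_code)]
struct TextOnlyModel; // hypothetical model that does not

impl SupportsToolCalling for LlamaWithTools {}

/// A request that needs tool calling only accepts capable models.
fn generate_with_tools<M: SupportsToolCalling>(_model: M) {
    // ... build and send the request ...
}

fn main() {
    generate_with_tools(LlamaWithTools); // compiles
    // generate_with_tools(TextOnlyModel); // rejected by the compiler
}
```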
Installation
Enable the Groq provider feature:
```bash
cargo add aisdk --features groq
```
This installs AISDK with the Groq provider enabled. Once the feature is enabled, all AISDK features are available with Groq.
Create a Provider Instance
To create a provider instance, call the constructor named after the Groq model you want to use. Model names are exposed as snake-case methods.
```rust
use aisdk::providers::Groq;

let groq = Groq::llama_3_1_8b_instruct();
```
This initializes the provider with:
- Model: `"llama-3.1-8b-instruct"`
- API key from the environment (if set with `GROQ_API_KEY`)
- Groq's default base URL (`https://api.groq.com/openai/`)
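Because the API key is resolved from the environment by default, it can help to check for it up front and fail with a clear message. A minimal sketch using only the constructor shown above:

```rust
use aisdk::providers::Groq;

fn main() {
    // Groq::llama_3_1_8b_instruct() falls back to the GROQ_API_KEY environment
    // variable, so fail fast with a clear message when it is missing.
    if std::env::var("GROQ_API_KEY").is_err() {
        eprintln!("GROQ_API_KEY is not set; export it before running.");
        std::process::exit(1);
    }
    let _groq = Groq::llama_3_1_8b_instruct();
}
```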
Basic Text Generation
Example using LanguageModelRequest for text generation.
```rust
use aisdk::{
    core::LanguageModelRequest,
    providers::Groq,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let groq = Groq::llama_3_1_8b_instruct();

    let response = LanguageModelRequest::builder()
        .model(groq)
        .prompt("Write a short poem about Rust.")
        .build()
        .generate_text()
        .await?;

    println!("Response text: {:?}", response.text());

    Ok(())
}
```
Provider Settings
You can customize the provider configuration using Groq::builder().
API Key
```rust
let groq = Groq::<MetaLlamaLlama318bInstructV10>::builder()
    .api_key("your-api-key")
    .build()?;
```
If not specified, AISDK uses the GROQ_API_KEY environment variable.
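If your deployment stores the key under a different variable, you can read it yourself and pass it explicitly. A minimal sketch (MY_GROQ_KEY is a hypothetical variable name, and the model type import is omitted to match the other snippets on this page):

```rust
// Read the key from a custom environment variable and pass it to the builder.
// MY_GROQ_KEY is a hypothetical variable name used for illustration.
let api_key = std::env::var("MY_GROQ_KEY")?;
let groq = Groq::<MetaLlamaLlama318bInstructV10>::builder()
    .api_key(api_key.as_str())
    .build()?;
```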
Base URL
Useful when routing through a proxy, gateway, or self-hosted compatible endpoint.
```rust
let groq = Groq::<MetaLlamaLlama318bInstructV10>::builder()
    .base_url("https://api.groq.com/openai/")
    .build()?;
```
Provider Name
For logging, analytics, and observability.
```rust
let groq = Groq::<MetaLlamaLlama318bInstructV10>::builder()
    .provider_name("Groq")
    .build()?;
```
Full Custom Configuration Example
```rust
let groq = Groq::<MetaLlamaLlama318bInstructV10>::builder()
    .api_key("your-api-key")
    .base_url("https://api.groq.com/openai/")
    .provider_name("Groq")
    .build()?;
```
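A customized provider is used exactly like the default one. A short sketch reusing the request builder from the Basic Text Generation example (it assumes the same async context and imports):

```rust
// Inside an async context (e.g. the #[tokio::main] function shown earlier),
// pass the customized provider to the request builder like any other model.
let response = LanguageModelRequest::builder()
    .model(groq)
    .prompt("Write a short poem about Rust.")
    .build()
    .generate_text()
    .await?;
println!("Response text: {:?}", response.text());
```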
Dynamic Model Selection
For runtime model selection (e.g., loading models from config files), use DynamicModel:
Using model_name() Method with Default Settings
```rust
use aisdk::providers::Groq;

// Specify model as a string at runtime
let groq = Groq::model_name("llama-3.1-8b-instruct");
```
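Since the model is plain string data here, it can come from configuration at runtime. A minimal sketch that reads the model name from an environment variable (GROQ_MODEL is a hypothetical name used for illustration):

```rust
use aisdk::providers::Groq;

// GROQ_MODEL is a hypothetical environment variable; fall back to a default
// model when it is not set.
let model_name = std::env::var("GROQ_MODEL")
    .unwrap_or_else(|_| "llama-3.1-8b-instruct".to_string());
let groq = Groq::model_name(model_name.as_str());
```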
Using Builder Pattern with Custom Settings
```rust
use aisdk::{
    core::DynamicModel,
    providers::Groq,
};

let groq = Groq::<DynamicModel>::builder()
    .model_name("llama-3.1-8b-instruct")
    .api_key("your-api-key")
    .base_url("https://api.groq.com/openai/")
    .build()?;
```
Warning: When using DynamicModel, model capabilities are not validated at compile time.
This means there's no guarantee the model supports requested features (e.g., tool calls, structured output).
For compile-time safety, use the typed methods like Groq::llama_3_1_8b_instruct().
Next Steps
- Take a deeper look at text generation features: Generating Text / Streaming Text.
- Explore Structured Output for reliable agent data.
- Learn how to create Custom Tools.
- Learn more about Agents.