# Providers Overview

yoagent supports multiple LLM providers through the `StreamProvider` trait and `ApiProtocol`-based dispatch.

## Supported Protocols

| Protocol | Provider Struct | API Format |
|---|---|---|
| `AnthropicMessages` | `AnthropicProvider` | Anthropic Messages API |
| `OpenAiCompletions` | `OpenAiCompatProvider` | OpenAI Chat Completions |
| `OpenAiResponses` | `OpenAiResponsesProvider` | OpenAI Responses API |
| `AzureOpenAiResponses` | `AzureOpenAiProvider` | Azure OpenAI Responses |
| `GoogleGenerativeAi` | `GoogleProvider` | Google Gemini API |
| `GoogleVertex` | `GoogleVertexProvider` | Google Vertex AI |
| `BedrockConverseStream` | `BedrockProvider` | AWS Bedrock ConverseStream |

## ApiProtocol Enum

```rust
pub enum ApiProtocol {
    AnthropicMessages,
    OpenAiCompletions,
    OpenAiResponses,
    AzureOpenAiResponses,
    GoogleGenerativeAi,
    GoogleVertex,
    BedrockConverseStream,
}
```
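The enum variant selects which provider handles a request. As a minimal, self-contained sketch of dispatching on it, the example below re-declares the enum locally so it runs on its own; the helper `uses_openai_wire_format` is hypothetical, not part of yoagent's API:

```rust
// Local re-declaration so this sketch compiles standalone.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub enum ApiProtocol {
    AnthropicMessages,
    OpenAiCompletions,
    OpenAiResponses,
    AzureOpenAiResponses,
    GoogleGenerativeAi,
    GoogleVertex,
    BedrockConverseStream,
}

/// Hypothetical helper: true for variants that speak an OpenAI-style wire format.
fn uses_openai_wire_format(p: ApiProtocol) -> bool {
    matches!(
        p,
        ApiProtocol::OpenAiCompletions
            | ApiProtocol::OpenAiResponses
            | ApiProtocol::AzureOpenAiResponses
    )
}

fn main() {
    assert!(uses_openai_wire_format(ApiProtocol::OpenAiCompletions));
    assert!(!uses_openai_wire_format(ApiProtocol::AnthropicMessages));
}
```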

## ModelConfig

Full configuration for a model, including provider routing:

```rust
pub struct ModelConfig {
    pub id: String,              // e.g. "gpt-4o"
    pub name: String,            // e.g. "GPT-4o"
    pub api: ApiProtocol,        // Which provider to use
    pub provider: String,        // e.g. "openai"
    pub base_url: String,        // API endpoint
    pub reasoning: bool,         // Supports thinking/reasoning
    pub context_window: u32,     // Context size in tokens
    pub max_tokens: u32,         // Default max output
    pub cost: CostConfig,        // Pricing per million tokens
    pub headers: HashMap<String, String>,  // Extra headers
    pub compat: Option<OpenAiCompat>,      // Quirk flags
}
```
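The `cost` field prices usage per million tokens. A rough sketch of estimating a request's cost from that pricing — `CostConfig`'s field names (`input`, `output`) are assumptions here, and the struct is re-declared locally so the example stands alone:

```rust
// Self-contained sketch: estimating request cost from per-million-token pricing.
#[derive(Debug, Clone, Copy)]
pub struct CostConfig {
    pub input: f64,  // USD per 1M input tokens (assumed field name)
    pub output: f64, // USD per 1M output tokens (assumed field name)
}

/// Estimated USD cost for a single request.
fn estimate_cost(cost: &CostConfig, input_tokens: u32, output_tokens: u32) -> f64 {
    (input_tokens as f64 / 1_000_000.0) * cost.input
        + (output_tokens as f64 / 1_000_000.0) * cost.output
}

fn main() {
    let cost = CostConfig { input: 3.0, output: 15.0 };
    // 10k input + 1k output tokens at $3/$15 per million:
    let usd = estimate_cost(&cost, 10_000, 1_000);
    assert!((usd - 0.045).abs() < 1e-9);
}
```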

Convenience constructors:

```rust
let anthropic = ModelConfig::anthropic("claude-sonnet-4-20250514", "Claude Sonnet 4");
let openai = ModelConfig::openai("gpt-4o", "GPT-4o");
let google = ModelConfig::google("gemini-2.0-flash", "Gemini 2.0 Flash");
```

## ProviderRegistry

Maps `ApiProtocol` → `StreamProvider`. The default registry includes all built-in providers:

```rust
let registry = ProviderRegistry::default();

// Use it to stream with any model
let result = registry.stream(&model_config, stream_config, tx, cancel).await?;
```

Custom registries:

```rust
let mut registry = ProviderRegistry::new();
registry.register(ApiProtocol::AnthropicMessages, AnthropicProvider);
```
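Under the hood a registry like this is essentially a map from protocol to a boxed provider trait object. A std-only sketch of that dispatch pattern, with simplified stand-in types (the real trait is async; `name` here is just a placeholder for its `stream` method):

```rust
use std::collections::HashMap;

// Simplified stand-ins for the real types, so the sketch runs on its own.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
enum ApiProtocol { AnthropicMessages, OpenAiCompletions }

trait StreamProvider {
    fn name(&self) -> &'static str; // placeholder for the real async `stream`
}

struct AnthropicProvider;
impl StreamProvider for AnthropicProvider {
    fn name(&self) -> &'static str { "anthropic" }
}

struct Registry {
    providers: HashMap<ApiProtocol, Box<dyn StreamProvider>>,
}

impl Registry {
    fn new() -> Self {
        Self { providers: HashMap::new() }
    }
    // Boxing the provider lets one map hold heterogeneous provider types.
    fn register(&mut self, p: ApiProtocol, provider: impl StreamProvider + 'static) {
        self.providers.insert(p, Box::new(provider));
    }
    fn get(&self, p: ApiProtocol) -> Option<&dyn StreamProvider> {
        self.providers.get(&p).map(|b| b.as_ref())
    }
}

fn main() {
    let mut registry = Registry::new();
    registry.register(ApiProtocol::AnthropicMessages, AnthropicProvider);
    assert_eq!(registry.get(ApiProtocol::AnthropicMessages).unwrap().name(), "anthropic");
    assert!(registry.get(ApiProtocol::OpenAiCompletions).is_none());
}
```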

## StreamProvider Trait

```rust
#[async_trait]
pub trait StreamProvider: Send + Sync {
    async fn stream(
        &self,
        config: StreamConfig,
        tx: mpsc::UnboundedSender<StreamEvent>,
        cancel: CancellationToken,
    ) -> Result<Message, ProviderError>;
}
```

All providers receive a `StreamConfig`, emit `StreamEvent`s through the channel, and return the final `Message`.
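To add a custom provider, you implement this trait and register it. The sketch below mirrors that contract with std-only stand-ins — a synchronous trait and `std::sync::mpsc` instead of `async_trait`/tokio, and simplified assumed shapes for the event and message types — so it compiles without the yoagent crate:

```rust
use std::sync::mpsc;

// Simplified stand-ins for yoagent's types (the real trait is async and
// takes a StreamConfig, a tokio unbounded sender, and a CancellationToken).
#[derive(Debug, PartialEq)]
enum StreamEvent { TextDelta(String), Done }

#[derive(Debug, PartialEq)]
struct Message { text: String }

#[derive(Debug)]
struct ProviderError;

trait StreamProvider {
    fn stream(&self, tx: mpsc::Sender<StreamEvent>) -> Result<Message, ProviderError>;
}

/// A toy provider that "streams" a fixed reply in two chunks.
struct EchoProvider;

impl StreamProvider for EchoProvider {
    fn stream(&self, tx: mpsc::Sender<StreamEvent>) -> Result<Message, ProviderError> {
        let chunks = ["Hello, ", "world"];
        for c in chunks {
            // Emit incremental deltas as they become available.
            tx.send(StreamEvent::TextDelta(c.to_string())).map_err(|_| ProviderError)?;
        }
        tx.send(StreamEvent::Done).map_err(|_| ProviderError)?;
        // Return the final assembled message, as the real trait does.
        Ok(Message { text: chunks.concat() })
    }
}

fn main() {
    let (tx, rx) = mpsc::channel();
    let msg = EchoProvider.stream(tx).unwrap();
    let events: Vec<StreamEvent> = rx.iter().collect();
    assert_eq!(msg.text, "Hello, world");
    assert_eq!(events.len(), 3); // two deltas + Done
}
```

The key property the sketch preserves is the dual output path: consumers watch the channel for incremental events while the return value carries the complete message.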