# OpenAI Compatible Provider
`OpenAiCompatProvider` implements the OpenAI Chat Completions API. One implementation covers OpenAI, xAI, Groq, Cerebras, OpenRouter, Mistral, DeepSeek, and any other compatible API.
## Usage
Requires a `ModelConfig` with compat flags set in `StreamConfig.model_config`:
```rust
use yoagent::provider::{OpenAiCompatProvider, ModelConfig};

let agent = Agent::new(OpenAiCompatProvider)
    .with_model("gpt-4o")
    .with_api_key(std::env::var("OPENAI_API_KEY").unwrap());
```
## OpenAiCompat Quirk Flags
Providers that share the same API still differ in behavior, so each quirk is captured as a flag:
```rust
pub struct OpenAiCompat {
    pub supports_store: bool,
    pub supports_developer_role: bool,
    pub supports_reasoning_effort: bool,
    pub supports_usage_in_streaming: bool,
    pub max_tokens_field: MaxTokensField, // MaxTokens or MaxCompletionTokens
    pub requires_tool_result_name: bool,
    pub requires_assistant_after_tool_result: bool,
    pub thinking_format: ThinkingFormat, // OpenAi, Xai, or Qwen
}
```
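As a self-contained sketch of how one of these flags might steer request serialization (the `MaxTokensField` variants mirror the comment above; the `max_tokens_key` function is a hypothetical illustration, not the crate's actual code):

```rust
// Illustrative sketch: a quirk flag selecting the JSON key for the token limit.
// `MaxTokensField` mirrors the enum referenced in the struct above.
#[derive(Clone, Copy, Debug, PartialEq)]
pub enum MaxTokensField {
    MaxTokens,
    MaxCompletionTokens,
}

/// Returns the JSON key to use for the token limit in a request body.
/// (Hypothetical helper for illustration.)
pub fn max_tokens_key(field: MaxTokensField) -> &'static str {
    match field {
        MaxTokensField::MaxTokens => "max_tokens",
        MaxTokensField::MaxCompletionTokens => "max_completion_tokens",
    }
}

fn main() {
    // Providers that deprecated `max_tokens` take the longer key instead.
    assert_eq!(max_tokens_key(MaxTokensField::MaxTokens), "max_tokens");
    println!("{}", max_tokens_key(MaxTokensField::MaxCompletionTokens));
}
```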
## Provider Presets
| Provider | Constructor | Key Differences |
|---|---|---|
| OpenAI | `OpenAiCompat::openai()` | `developer` role, `max_completion_tokens`, `store`, `reasoning_effort` |
| xAI (Grok) | `OpenAiCompat::xai()` | `reasoning` field for thinking (not `reasoning_content`) |
| Groq | `OpenAiCompat::groq()` | Standard defaults |
| Cerebras | `OpenAiCompat::cerebras()` | Standard defaults |
| OpenRouter | `OpenAiCompat::openrouter()` | `max_completion_tokens` |
| Mistral | `OpenAiCompat::mistral()` | `max_tokens` field |
| DeepSeek | `OpenAiCompat::deepseek()` | `max_completion_tokens` |
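The presets above are just different flag combinations over a shared default. A self-contained sketch of that pattern (the field set is simplified to booleans and the concrete default values are assumptions, not the crate's actual definitions):

```rust
// Illustrative sketch: provider presets as flag combinations over a default.
// Simplified field set; the real struct carries more quirk flags.
#[derive(Clone, Debug, Default, PartialEq)]
pub struct OpenAiCompat {
    pub supports_store: bool,
    pub supports_developer_role: bool,
    pub supports_reasoning_effort: bool,
    pub uses_max_completion_tokens: bool,
}

impl OpenAiCompat {
    /// OpenAI: developer role, store, reasoning_effort, max_completion_tokens.
    pub fn openai() -> Self {
        Self {
            supports_store: true,
            supports_developer_role: true,
            supports_reasoning_effort: true,
            uses_max_completion_tokens: true,
        }
    }

    /// Groq: standard defaults, so no flags are overridden.
    pub fn groq() -> Self {
        Self::default()
    }
}

fn main() {
    assert!(OpenAiCompat::openai().supports_developer_role);
    assert_eq!(OpenAiCompat::groq(), OpenAiCompat::default());
}
```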
## Adding a New Compatible Provider
- Add a constructor to `OpenAiCompat`:
```rust
impl OpenAiCompat {
    pub fn my_provider() -> Self {
        Self {
            supports_usage_in_streaming: true,
            // set flags as needed...
            ..Default::default()
        }
    }
}
```
- Create a `ModelConfig` that uses it:
```rust
let config = ModelConfig {
    id: "my-model".into(),
    name: "My Model".into(),
    api: ApiProtocol::OpenAiCompletions,
    provider: "my-provider".into(),
    base_url: "https://api.myprovider.com/v1".into(),
    compat: Some(OpenAiCompat::my_provider()),
    // ...
};
```
## Thinking/Reasoning
The `ThinkingFormat` enum controls how reasoning content is parsed from streams:
- `ThinkingFormat::OpenAi`: uses the `reasoning_content` field (DeepSeek, default)
- `ThinkingFormat::Xai`: uses the `reasoning` field (Grok)
- `ThinkingFormat::Qwen`: uses the `reasoning_content` field (Qwen)
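The mapping above can be sketched as a single match (the enum variants follow the list; the `reasoning_field` helper is illustrative, not the crate's parser):

```rust
// Illustrative sketch: mapping a thinking format to the streamed JSON
// field that carries reasoning tokens. Variants follow the list above.
#[derive(Clone, Copy, Debug)]
pub enum ThinkingFormat {
    OpenAi,
    Xai,
    Qwen,
}

/// JSON field holding reasoning content in a streaming delta.
/// (Hypothetical helper for illustration.)
pub fn reasoning_field(format: ThinkingFormat) -> &'static str {
    match format {
        // DeepSeek-style (default) and Qwen both stream `reasoning_content`.
        ThinkingFormat::OpenAi | ThinkingFormat::Qwen => "reasoning_content",
        // Grok streams a bare `reasoning` field instead.
        ThinkingFormat::Xai => "reasoning",
    }
}

fn main() {
    assert_eq!(reasoning_field(ThinkingFormat::Xai), "reasoning");
    println!("{}", reasoning_field(ThinkingFormat::OpenAi));
}
```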
## Auth
Uses an `Authorization: Bearer {api_key}` header. Extra headers can be added via `ModelConfig.headers`.
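A self-contained sketch of assembling that header set (the `build_headers` function is hypothetical; the `extra` map stands in for `ModelConfig.headers`):

```rust
use std::collections::BTreeMap;

// Illustrative sketch: bearer auth plus user-supplied extra headers.
pub fn build_headers(
    api_key: &str,
    extra: &BTreeMap<String, String>,
) -> BTreeMap<String, String> {
    let mut headers = BTreeMap::new();
    headers.insert("Authorization".to_string(), format!("Bearer {api_key}"));
    // Extra headers are merged in afterwards, so they can override defaults.
    for (key, value) in extra {
        headers.insert(key.clone(), value.clone());
    }
    headers
}

fn main() {
    let mut extra = BTreeMap::new();
    extra.insert("X-Title".to_string(), "my-app".to_string());
    let headers = build_headers("sk-test", &extra);
    assert_eq!(headers["Authorization"], "Bearer sk-test");
    assert_eq!(headers["X-Title"], "my-app");
}
```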