
AI Providers

Dory uses a pluggable AI provider architecture for AI Chat, SQL generation, chart suggestions, and schema-aware analysis. You can switch providers through environment variables without changing application code.

Supported providers

Provider          | DORY_AI_PROVIDER value | Notes
OpenAI            | openai                 | Uses the official OpenAI API.
OpenAI-compatible | openai-compatible      | For services exposing an OpenAI-compatible API.
Anthropic         | anthropic              | Claude models via Anthropic's API.
Google            | google                 | Gemini models via Google Generative AI.
Qwen              | qwen                   | Qwen models through a compatible endpoint.
xAI               | xai                    | Grok models via xAI API.

Core variables

Most deployments need the following values:

export DORY_AI_PROVIDER=openai
export DORY_AI_MODEL=gpt-4o-mini
export DORY_AI_API_KEY=your_api_key_here
export DORY_AI_URL=https://api.openai.com/v1

Variable         | Required           | Description
DORY_AI_PROVIDER | Yes                | Selects the provider adapter.
DORY_AI_MODEL    | Yes                | Selects the model used by Dory AI features.
DORY_AI_API_KEY  | Usually            | API key or bearer token accepted by the provider.
DORY_AI_URL      | Provider-dependent | Base URL for OpenAI-compatible providers or custom endpoints.
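
Before starting the server, it can help to fail fast when a required variable is missing. The sketch below checks the three variables this page marks as required (or usually required); the helper name `check_dory_ai_env` is hypothetical, not part of Dory itself.

```shell
# Hypothetical pre-flight helper: verify the core Dory AI variables are set.
# Returns 0 when all three are non-empty, 1 otherwise.
check_dory_ai_env() {
  missing=0
  for pair in "DORY_AI_PROVIDER=$DORY_AI_PROVIDER" \
              "DORY_AI_MODEL=$DORY_AI_MODEL" \
              "DORY_AI_API_KEY=$DORY_AI_API_KEY"; do
    name=${pair%%=*}    # variable name before the first '='
    value=${pair#*=}    # current value after the first '='
    if [ -z "$value" ]; then
      echo "missing: $name" >&2
      missing=1
    fi
  done
  return $missing
}
```

Run it in the same shell session (or startup script) that exports the variables, before launching Dory.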

Do not put provider credentials in frontend code, checked-in .env files, screenshots, or support tickets.

Provider examples

OpenAI

DORY_AI_PROVIDER=openai
DORY_AI_MODEL=gpt-4o-mini
DORY_AI_API_KEY=sk-...
DORY_AI_URL=https://api.openai.com/v1

Use this path when you want the most direct documented setup for Dory AI SQL generation.
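
To confirm the key works before wiring it into Dory, one option is to list models through OpenAI's documented GET /v1/models endpoint; a 200 response with a model list indicates the key is valid. The fallback URL below is an assumption matching the example above.

```shell
# List available models to verify the API key, assuming the standard
# OpenAI /v1/models endpoint. A 401 here points to a key problem.
curl -sS "${DORY_AI_URL:-https://api.openai.com/v1}/models" \
  -H "Authorization: Bearer $DORY_AI_API_KEY"
```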

OpenAI-compatible

DORY_AI_PROVIDER=openai-compatible
DORY_AI_MODEL=your-model-name
DORY_AI_API_KEY=your_provider_key
DORY_AI_URL=https://your-compatible-endpoint.example.com/v1

Use this path when your organization already standardizes on an OpenAI-compatible API surface.
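
A common failure with compatible endpoints is a base URL that is missing the expected path suffix or carries a trailing slash. The helper below is a hypothetical sketch, assuming (as in the example above) that the endpoint expects a base path ending in /v1; confirm the exact path against your provider's documentation.

```shell
# Hypothetical helper: sanity-check the shape of an OpenAI-compatible base URL.
check_base_url() {
  case "$1" in
    */v1) echo "ok: $1" ;;
    */)   echo "warning: trailing slash can break path joining: $1" ;;
    *)    echo "warning: no /v1 suffix, confirm against provider docs: $1" ;;
  esac
}
```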

Anthropic

DORY_AI_PROVIDER=anthropic
DORY_AI_MODEL=claude-3-5-sonnet-latest
DORY_AI_API_KEY=your_anthropic_key

Use Anthropic when your team prefers Claude models for long-context reasoning or for policy reasons.

Google

DORY_AI_PROVIDER=google
DORY_AI_MODEL=gemini-1.5-pro
DORY_AI_API_KEY=your_google_key

Use Google when your organization already runs Gemini workflows.

Qwen

DORY_AI_PROVIDER=qwen
DORY_AI_MODEL=qwen-plus
DORY_AI_API_KEY=your_qwen_key

Use Qwen when it is the preferred model family for your language, region, or cost profile.

xAI

DORY_AI_PROVIDER=xai
DORY_AI_MODEL=grok-2-latest
DORY_AI_API_KEY=your_xai_key

Use xAI when Grok models are part of your approved AI stack.

Choosing a provider

Requirement                           | Recommended direction
Fastest standard setup                | Start with OpenAI.
Existing compatible endpoint or proxy | Use OpenAI-compatible.
Long reasoning tasks                  | Compare Anthropic and OpenAI models.
Existing Google AI usage              | Use Google.
Region, language, or cost fit         | Compare Qwen and other approved providers.
Internal compliance requirement       | Choose the provider already approved by your organization.

Validation checklist

  1. Confirm the provider value matches one of the documented DORY_AI_PROVIDER values.
  2. Confirm the model name exists for that provider.
  3. Confirm the API key has permission to call the selected model.
  4. If using a compatible endpoint, confirm DORY_AI_URL includes the correct base path.
  5. Restart the Dory server after changing provider variables.
  6. Test AI Chat with a small schema question before testing large SQL generation tasks.
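
Step 1 of the checklist can be scripted. The sketch below validates a candidate value against the documented DORY_AI_PROVIDER values; the function name `valid_provider` is a hypothetical helper, not part of Dory.

```shell
# Hypothetical helper for checklist step 1: is this one of the documented
# DORY_AI_PROVIDER values?
valid_provider() {
  case "$1" in
    openai|openai-compatible|anthropic|google|qwen|xai) return 0 ;;
    *) return 1 ;;
  esac
}
```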

Troubleshooting

Symptom                               | Likely cause                                | Fix
AI Chat returns authentication errors | Invalid or missing API key.                 | Rotate the key and update DORY_AI_API_KEY.
Model not found                       | Wrong model name for the selected provider. | Use a model name supported by that provider.
Compatible endpoint fails             | Incorrect base URL or path.                 | Verify the provider's OpenAI-compatible base URL.
SQL quality is inconsistent           | Model has limited schema reasoning ability. | Try a stronger model and include clearer table context.
Responses are slow                    | Provider latency or large schema context.   | Use a faster model or narrow the database context.
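
For OpenAI-compatible endpoints, the first two rows above can usually be told apart by the HTTP status of a minimal request: 401 points to the key, while 400/404 typically points to the model name or path. This sketch assumes the standard /chat/completions path under DORY_AI_URL.

```shell
# Send a one-line chat request and print only the HTTP status code.
# 200 = working; 401 = key problem; 400/404 = model name or path problem.
curl -sS -o /dev/null -w "%{http_code}\n" \
  "$DORY_AI_URL/chat/completions" \
  -H "Authorization: Bearer $DORY_AI_API_KEY" \
  -H "Content-Type: application/json" \
  -d "{\"model\": \"$DORY_AI_MODEL\", \"messages\": [{\"role\": \"user\", \"content\": \"ping\"}]}"
```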

Limitation

Provider support does not guarantee identical output quality. SQL generation quality will vary by model, schema complexity, and prompt clarity.
