Provider integrations: Connect OpenAI, Anthropic, Voyage AI, and more

Quantlix Team

Quantlix doesn't run models itself — it orchestrates them. You connect providers (OpenAI, Anthropic, Voyage AI, Azure OpenAI, or self-hosted), and Quantlix handles deployment, inference, and policies.

Provider = credentials + models

A provider is an org-scoped connection to a model API. You add a provider, set your API key (or other credential), and sync the available models. Credentials are encrypted at rest. Once synced, you bind deployments to specific provider models.
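As a mental model, a provider can be pictured as a small record tying an org, a credential, and a synced model list together. This is an illustrative sketch only; the `Provider` class and its field names are hypothetical, not Quantlix's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Provider:
    """Org-scoped connection to a model API (illustrative, not Quantlix's schema)."""
    org_id: str
    kind: str        # e.g. "openai", "anthropic", "voyage", "azure-openai"
    credential: str  # encrypted at rest by the platform; plain here for the sketch
    models: list[str] = field(default_factory=list)

    def sync_models(self, available: list[str]) -> None:
        # In the real flow, Quantlix pulls this list from the provider's API.
        self.models = list(available)

voyage = Provider(org_id="acme", kind="voyage", credential="<api-key>")
voyage.sync_models(["voyage-3", "voyage-3-lite"])
```

Once a provider holds a synced model list, deployments bind against entries in that list rather than against raw API keys.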

Quick setup

1. Create an org (if you don't have one)

2. Go to **Dashboard → Providers** and click **Add Provider**

3. Choose the type: Voyage AI, OpenAI, Anthropic, Azure OpenAI, etc.

4. Paste your API key

5. Click **Sync Models** to pull the available models

6. When creating a deployment, bind it to a provider model in the Inference target section
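The steps above can be sketched as an in-memory flow. `QuantlixSketch` and its method names are hypothetical stand-ins to show the ordering (org, then provider, then sync, then bind), not Quantlix's real client API:

```python
class QuantlixSketch:
    """In-memory stand-in for the setup flow; names are illustrative."""
    def __init__(self):
        self.orgs, self.providers, self.deployments = {}, {}, {}

    def create_org(self, name):
        self.orgs[name] = {"providers": []}
        return name

    def add_provider(self, org, kind, api_key):
        pid = f"{org}/{kind}"
        self.providers[pid] = {"kind": kind, "key": api_key, "models": []}
        self.orgs[org]["providers"].append(pid)
        return pid

    def sync_models(self, pid, models):
        # Real flow: fetch the model list from the provider's API.
        self.providers[pid]["models"] = list(models)

    def create_deployment(self, name, pid, model):
        # Binding fails if the model was never synced on that provider.
        if model not in self.providers[pid]["models"]:
            raise ValueError("sync models before binding a deployment")
        self.deployments[name] = {"provider": pid, "model": model}

qx = QuantlixSketch()
org = qx.create_org("acme")
pid = qx.add_provider(org, "openai", "<api-key>")
qx.sync_models(pid, ["gpt-4o", "gpt-4o-mini"])
qx.create_deployment("support-bot", pid, "gpt-4o-mini")
```

Note the ordering constraint the sketch enforces: a deployment can only bind to a model that has actually been synced, which is why step 5 comes before step 6.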

Use cases

**Chat inference** — Bind a deployment to OpenAI or Anthropic for text generation. Quantlix handles the API calls, retries, and guardrails.
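Quantlix handles retries for you, but it helps to see the shape of the technique. This is a minimal retry-with-backoff sketch around a flaky call; `flaky_chat` is a stand-in for a provider request, not a real API:

```python
import random
import time

def with_retries(call, attempts=3, base_delay=0.5):
    """Retry a transient-failure-prone call with exponential backoff and jitter."""
    for attempt in range(attempts):
        try:
            return call()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the error
            # back off: base, 2x base, 4x base, ... plus a little jitter
            time.sleep(base_delay * 2**attempt + random.uniform(0, 0.1))

# Stand-in for a chat completion call that fails once, then succeeds.
state = {"calls": 0}
def flaky_chat():
    state["calls"] += 1
    if state["calls"] < 2:
        raise ConnectionError("transient upstream error")
    return "Hello from the model"

reply = with_retries(flaky_chat, base_delay=0.01)
```

Exponential backoff with jitter is the standard way to avoid hammering an upstream API that is already struggling.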

**Embeddings** — Voyage AI is popular for embeddings. Add a Voyage provider, sync models, and use it for your Knowledge/RAG pipelines. Embeddings power semantic search and retrieval.
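To see why embeddings power retrieval, here is a toy semantic-search sketch. The `embed` function is a bag-of-characters stand-in for a real Voyage AI embedding call, which would return dense vectors from the API; only the cosine-similarity ranking carries over to the real pipeline:

```python
import math

def embed(text):
    """Toy stand-in for a provider embedding call: a 26-dim letter-count vector."""
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

docs = ["reset your password", "invoice and billing", "deploy a model"]
index = [(d, embed(d)) for d in docs]  # embed once, search many times

def search(query):
    q = embed(query)
    return max(index, key=lambda pair: cosine(q, pair[1]))[0]
```

The real version is the same shape: embed documents once at ingest time, embed the query at search time, rank by cosine similarity.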

**RAG** — Combine an embedding provider with a knowledge base. Quantlix retrieves relevant chunks and generates answers with citations.
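The retrieve-then-generate loop can be sketched in a few lines. Retrieval here is naive keyword overlap and the generator is a stub that echoes the top chunk; chunk IDs and names are illustrative, and Quantlix's real pipeline uses provider embeddings and a chat model instead:

```python
# Hypothetical knowledge-base chunks keyed by a citation ID.
chunks = {
    "doc1#p2": "Providers are org-scoped and store encrypted credentials.",
    "doc2#p1": "Deployments bind to a specific provider model.",
}

def retrieve(question, k=1):
    """Rank chunks by word overlap with the question (toy retrieval)."""
    words = set(question.lower().split())
    scored = sorted(
        chunks.items(),
        key=lambda kv: len(words & set(kv[1].lower().split())),
        reverse=True,
    )
    return scored[:k]

def answer(question):
    cid, text = retrieve(question)[0]
    # A real pipeline would send the retrieved context plus the question
    # to a chat provider; this stub returns the chunk with its citation.
    return f"{text} [{cid}]"
```

The citation ID travels with the chunk through the whole loop, which is what lets the final answer point back at its source.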

Self-hosted

You can add self-hosted providers: point Quantlix at your own model endpoint, set the credential, and it works like any other provider. This is useful for on-prem or private models.
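For a sense of what a self-hosted connection involves, here is a hypothetical config fragment for an OpenAI-compatible endpoint. Every field name, the URL, and the model name are illustrative assumptions, not Quantlix's actual schema:

```python
# Hypothetical self-hosted provider config (illustrative fields only).
self_hosted = {
    "type": "self-hosted",
    "base_url": "https://models.internal.example.com/v1",
    "credential": "<bearer-token>",  # encrypted at rest once stored
    "models": ["llama-3.1-8b-instruct"],
}
```

The key point is that nothing else changes: once the endpoint and credential are registered, deployments bind to the self-hosted models exactly as they would to a hosted provider's.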