# Bring Your Own Model (BYOM)

Configure custom AI model providers: Anthropic, OpenAI, Google, or self-hosted endpoints.
QANATIX uses AI models for features like chat, description generation, and data extraction. By default, QANATIX uses its own managed model. With BYOM, you can configure your own model provider.
## Supported providers

| Provider | Models | Config key |
|---|---|---|
| Anthropic | Claude 4 Opus, Claude 4 Sonnet, Claude 3.5 Haiku | `anthropic` |
| OpenAI | GPT-4o, GPT-4o-mini, o3-mini | `openai` |
| Google | Gemini 2.5 Pro, Gemini 2.5 Flash | `google` |
| Custom endpoint | Any OpenAI-compatible API | `custom` |
## Configure via dashboard

1. Go to Settings > Models in the QANATIX dashboard.
2. Select a provider.
3. Enter your API key.
4. Choose a default model.
5. Save.
The API key is encrypted at rest and never exposed after saving.
## Configure via API

```bash
curl -X PATCH https://api.qanatix.com/api/v1/portal/account \
  -H "Authorization: Bearer <JWT>" \
  -H "Content-Type: application/json" \
  -d '{
    "model_provider": "anthropic",
    "model_api_key": "sk-ant-...",
    "model_name": "claude-sonnet-4-20250514"
  }'
```

### Parameters
| Field | Type | Required | Description |
|---|---|---|---|
| `model_provider` | string | yes | `anthropic`, `openai`, `google`, or `custom` |
| `model_api_key` | string | yes | API key for the provider |
| `model_name` | string | no | Specific model ID. Falls back to the provider default if omitted. |
| `model_endpoint` | string | no | Custom endpoint URL (required for the `custom` provider) |
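The required/optional rules in the table above can be checked client-side before sending the PATCH. The helper below is an illustrative sketch, not part of any QANATIX SDK:

```python
# Illustrative client-side validation of a BYOM payload. The rules
# mirror the parameter table: provider and key are required, and the
# "custom" provider additionally requires an endpoint URL.
VALID_PROVIDERS = {"anthropic", "openai", "google", "custom"}

def validate_byom_payload(payload: dict) -> list[str]:
    """Return a list of validation errors; empty means the payload is valid."""
    errors = []
    provider = payload.get("model_provider")
    if provider not in VALID_PROVIDERS:
        errors.append(f"model_provider must be one of {sorted(VALID_PROVIDERS)}")
    if not payload.get("model_api_key"):
        errors.append("model_api_key is required")
    if provider == "custom" and not payload.get("model_endpoint"):
        errors.append("model_endpoint is required when model_provider is 'custom'")
    return errors
```

Running this before the API call turns a round-trip failure into an immediate local error message.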
## Custom model endpoints

For self-hosted or third-party OpenAI-compatible APIs (e.g., vLLM, Ollama, Azure OpenAI):

```bash
curl -X PATCH https://api.qanatix.com/api/v1/portal/account \
  -H "Authorization: Bearer <JWT>" \
  -H "Content-Type: application/json" \
  -d '{
    "model_provider": "custom",
    "model_api_key": "your-key",
    "model_name": "my-model",
    "model_endpoint": "https://llm.internal.company.com/v1"
  }'
```

The endpoint must be OpenAI-compatible (`/v1/chat/completions`).
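Before pointing QANATIX at a self-hosted endpoint, it can help to confirm the endpoint actually speaks the OpenAI chat-completions dialect by sending it a minimal request directly. This is an illustrative sketch (the `probe` helper is not a QANATIX tool, and it assumes the configured endpoint already includes the `/v1` prefix, as in the example above):

```python
import json
import urllib.request

def chat_completions_url(endpoint: str) -> str:
    # The configured model_endpoint already ends in /v1,
    # so only the chat-completions route is appended.
    return endpoint.rstrip("/") + "/chat/completions"

def probe(endpoint: str, api_key: str, model: str) -> dict:
    """Send a minimal one-token chat request and return the parsed reply."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 1,
    }).encode()
    req = urllib.request.Request(
        chat_completions_url(endpoint),
        data=body,
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

If `probe` returns a well-formed completion, the endpoint should work as a `custom` provider.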
## Default model selection

If no BYOM provider is configured:

- Cloud (managed): QANATIX uses its own managed model; no API key is needed from you.
- Self-hosted: You must configure a model provider. QANATIX cannot make external API calls from air-gapped deployments without an explicit provider configuration.
## Self-hosted vs. cloud
| Behavior | Cloud | Self-hosted |
|---|---|---|
| Default model | Managed by QANATIX | Must configure BYOM |
| API key storage | Encrypted in managed DB | Encrypted in your Postgres |
| External calls | QANATIX proxy | Direct from your infra |
| Air-gapped support | N/A | Use custom provider pointing to local endpoint |
## Verify configuration

After configuring, test with a chat query:

```bash
curl -X POST https://api.qanatix.com/api/v1/chat \
  -H "Authorization: Bearer sk_live_..." \
  -H "Content-Type: application/json" \
  -d '{"message": "How many records do I have?"}'
```

If the model provider is misconfigured, you will get a 502 error with details about the upstream failure.