
Bring Your Own Model

Configure custom AI model providers — Anthropic, OpenAI, Google, or self-hosted endpoints.

Bring Your Own Model (BYOM)

QANATIX uses AI models for features like chat, description generation, and data extraction. By default, QANATIX uses its own managed model. With BYOM, you can configure your own model provider.

Supported providers

| Provider | Models | Config key |
|---|---|---|
| Anthropic | Claude 4 Opus, Claude 4 Sonnet, Claude 3.5 Haiku | anthropic |
| OpenAI | GPT-4o, GPT-4o-mini, o3-mini | openai |
| Google | Gemini 2.5 Pro, Gemini 2.5 Flash | google |
| Custom endpoint | Any OpenAI-compatible API | custom |
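
A cheap way to catch typos before calling the API is to validate the provider key client-side. A minimal sketch; the set mirrors the table above and is not part of any official QANATIX client:

```python
# Config keys from the supported-providers table.
SUPPORTED_PROVIDERS = {"anthropic", "openai", "google", "custom"}

def check_provider(key: str) -> str:
    """Return the normalized config key, or raise if unsupported."""
    normalized = key.strip().lower()
    if normalized not in SUPPORTED_PROVIDERS:
        raise ValueError(f"unsupported model_provider: {key!r}")
    return normalized
```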

Configure via dashboard

  1. Go to Settings > Models in the QANATIX dashboard
  2. Select a provider
  3. Enter your API key
  4. Choose a default model
  5. Save

The API key is encrypted at rest and never exposed after saving.

Configure via API

curl -X PATCH https://api.qanatix.com/api/v1/portal/account \
  -H "Authorization: Bearer <JWT>" \
  -H "Content-Type: application/json" \
  -d '{
    "model_provider": "anthropic",
    "model_api_key": "sk-ant-...",
    "model_name": "claude-sonnet-4-20250514"
  }'
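
The same PATCH can be issued from Python with only the standard library. A sketch that prepares the request shown above without sending it; the JWT and key values are placeholders:

```python
import json
import urllib.request

def build_byom_request(jwt: str, provider: str, api_key: str, model: str) -> urllib.request.Request:
    """Prepare (but do not send) the PATCH that configures BYOM."""
    payload = {
        "model_provider": provider,
        "model_api_key": api_key,
        "model_name": model,
    }
    return urllib.request.Request(
        "https://api.qanatix.com/api/v1/portal/account",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {jwt}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )

# To send: urllib.request.urlopen(build_byom_request(...))
```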

Parameters

| Field | Type | Required | Description |
|---|---|---|---|
| model_provider | string | yes | anthropic, openai, google, or custom |
| model_api_key | string | yes | API key for the provider |
| model_name | string | no | Specific model ID. Falls back to the provider default if omitted. |
| model_endpoint | string | no | Custom endpoint URL (required when model_provider is custom) |
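
The required/optional rules in the table, including the conditional requirement on model_endpoint, can be expressed as a small pre-flight check. A sketch, not part of any official SDK:

```python
def validate_byom_params(params: dict) -> None:
    """Raise ValueError if the BYOM parameters violate the rules in the table."""
    # model_provider and model_api_key are always required.
    for field in ("model_provider", "model_api_key"):
        if not params.get(field):
            raise ValueError(f"{field} is required")
    # model_endpoint is required only for the custom provider.
    if params["model_provider"] == "custom" and not params.get("model_endpoint"):
        raise ValueError("model_endpoint is required when model_provider is custom")
```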

Custom model endpoints

For self-hosted or third-party OpenAI-compatible APIs (e.g., vLLM, Ollama, Azure OpenAI):

curl -X PATCH https://api.qanatix.com/api/v1/portal/account \
  -H "Authorization: Bearer <JWT>" \
  -H "Content-Type: application/json" \
  -d '{
    "model_provider": "custom",
    "model_api_key": "your-key",
    "model_name": "my-model",
    "model_endpoint": "https://llm.internal.company.com/v1"
  }'

The endpoint must expose an OpenAI-compatible /v1/chat/completions route.
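
A common mistake is registering a base URL with a missing or doubled /v1 prefix. The helper below derives the chat-completions route from a base endpoint; whether QANATIX normalizes the path this way is an assumption, so treat it as an illustration of standard OpenAI path conventions:

```python
def chat_completions_url(base: str) -> str:
    """Join a custom model_endpoint with the OpenAI-compatible chat route."""
    base = base.rstrip("/")
    if not base.endswith("/v1"):
        base += "/v1"
    return base + "/chat/completions"
```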

Default model selection

If no BYOM is configured:

  • Cloud (managed): QANATIX uses its own managed model — no API key needed from you.
  • Self-hosted: You must configure a model provider. QANATIX makes no external API calls on your behalf, so air-gapped deployments should point the custom provider at an endpoint inside your network.

Self-hosted vs cloud

| Behavior | Cloud | Self-hosted |
|---|---|---|
| Default model | Managed by QANATIX | Must configure BYOM |
| API key storage | Encrypted in managed DB | Encrypted in your Postgres |
| External calls | Via QANATIX proxy | Direct from your infra |
| Air-gapped support | N/A | Use a custom provider pointing to a local endpoint |

Verify configuration

After configuring, test with a chat query:

curl -X POST https://api.qanatix.com/api/v1/chat \
  -H "Authorization: Bearer sk_live_..." \
  -H "Content-Type: application/json" \
  -d '{"message": "How many records do I have?"}'

If the model provider is misconfigured, you will get a 502 error with details about the upstream failure.
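
A verification script can branch on the status code accordingly. A sketch; the diagnostic strings are illustrative, not QANATIX's exact error bodies:

```python
def interpret_chat_status(status: int) -> str:
    """Map the /api/v1/chat HTTP status to a human-readable diagnosis."""
    if status == 200:
        return "ok: model provider is working"
    if status == 502:
        # Documented behavior: 502 signals an upstream model failure.
        return "upstream failure: check model_provider, model_api_key, and model_endpoint"
    return f"unexpected status {status}"
```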
