Connect through the proxy

Use http://localhost:18082 as your app's LLM base URL.

Main endpoint

http://localhost:18082

Supported client styles

The Community proxy is intended to work with:

  • OpenAI-compatible clients
  • Anthropic-compatible clients
  • Gemini-compatible clients
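For Anthropic-compatible clients, the base-URL override is usually an environment variable; ANTHROPIC_BASE_URL is the name the official Anthropic SDKs read, but confirm the exact name against your client's documentation. A minimal sketch:

```shell
# Point an Anthropic-compatible client at the proxy.
# ANTHROPIC_BASE_URL is read by the official Anthropic SDKs;
# other clients may use a different variable name.
export ANTHROPIC_BASE_URL=http://localhost:18082
echo "Anthropic base URL: $ANTHROPIC_BASE_URL"
```

Gemini-compatible clients often take the base-URL override in code or SDK configuration rather than a single standard variable; check that SDK's docs.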

Typical OpenAI-compatible setup

OPENAI_BASE_URL=http://localhost:18082

Some clients call the same setting OPENAI_API_BASE.
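As a concrete sketch, both spellings are shown below; which one your client reads depends on the client, and usually only one is needed:

```shell
# Route an OpenAI-compatible client through the proxy.
export OPENAI_BASE_URL=http://localhost:18082
# Some (often older) clients read OPENAI_API_BASE instead:
export OPENAI_API_BASE=http://localhost:18082
echo "OpenAI base URL: $OPENAI_BASE_URL"
```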

Login key vs provider key

There are two different keys in the Community flow:

  • the bootstrap tenant API key is only for logging into the dashboard at http://localhost:18080
  • your provider API key stays in your AI app or IDE for model requests
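Put another way, a typical app environment keeps the provider key next to the proxy base URL; the key value below is a placeholder, not a real credential:

```shell
# The provider API key stays with the app; only the base URL changes.
export OPENAI_API_KEY="sk-your-provider-key"   # placeholder provider key
export OPENAI_BASE_URL=http://localhost:18082  # requests now pass through the proxy
# The bootstrap tenant key is NOT set here; it is only used to log in
# to the dashboard at http://localhost:18080.
echo "base URL: $OPENAI_BASE_URL"
```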

Quick check

If your app still works after switching its LLM base URL to http://localhost:18082, Cencurity is in the traffic path.
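One way to run that check from the command line, assuming the proxy forwards the standard OpenAI-style /v1/models route (an assumption; substitute any endpoint your app already calls):

```shell
# Quick reachability check against the proxy (assumed /v1/models route).
PROXY_URL=http://localhost:18082
if curl -sS --max-time 5 "$PROXY_URL/v1/models" > /dev/null 2>&1; then
  STATUS="reachable"
else
  STATUS="not reachable"
fi
echo "proxy $STATUS at $PROXY_URL"
```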