Use

Once the Community stack is up, usage is straightforward.

Daily flow

  1. keep the stack running
  2. log in to the dashboard at http://localhost:18080 when you need visibility
  3. point your client's LLM base URL at http://localhost:18082
  4. use the app normally

Client setup

For OpenAI-compatible tools:

OPENAI_BASE_URL=http://localhost:18082

Anthropic-style and Gemini-style clients point at the same local proxy host (http://localhost:18082), using their respective compatible request paths.
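
The client setup above can be sketched as a small shell fragment. The OPENAI_BASE_URL line comes straight from this section; ANTHROPIC_BASE_URL follows the common Anthropic SDK convention, and the Gemini note is an assumption — check your client's own documentation for the exact setting.

```shell
# OpenAI-compatible tools (as shown above)
export OPENAI_BASE_URL="http://localhost:18082"

# Anthropic-style clients (ANTHROPIC_BASE_URL is the usual SDK convention;
# verify the variable name against your client's documentation)
export ANTHROPIC_BASE_URL="http://localhost:18082"

# Gemini-style clients often take the base URL in client code or config
# rather than via a standard environment variable — consult your client's docs.
```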

What to watch in the dashboard

  • live Audit Logs
  • detection outcomes
  • current policies
  • Dry Run behavior before enabling stronger enforcement

Important distinction

  • dashboard login uses the bootstrap tenant API key
  • model requests keep your app's normal provider authentication and simply pass through the proxy
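
The distinction can be sketched for an OpenAI-style client as follows; the TENANT_API_KEY variable name is illustrative only, not a documented interface — the key itself comes from your bootstrap output.

```shell
# Dashboard: log in at http://localhost:18080 with the bootstrap tenant API key.
# (Illustrative variable name; paste the key into the login form, it is not
# read from the environment.)
TENANT_API_KEY="<bootstrap tenant key>"

# Model requests: only the base URL changes; the provider key is untouched.
export OPENAI_BASE_URL="http://localhost:18082"
export OPENAI_API_KEY="<your normal provider key>"
```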