# Use
Once the Community stack is up, usage is straightforward.
## Daily flow

- keep the stack running
- log in to the dashboard at http://localhost:18080 when you need visibility
- set the LLM base URL to http://localhost:18082
- use the app normally
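The daily flow above can be sketched as two environment variables (the variable names here are illustrative; only `OPENAI_BASE_URL` is a convention that OpenAI-compatible tools commonly honor):

```shell
# Dashboard is for humans; the proxy is for model traffic.
DASHBOARD_URL=http://localhost:18080            # open in a browser for visibility
export OPENAI_BASE_URL=http://localhost:18082   # LLM base URL for your app

echo "dashboard: $DASHBOARD_URL"
echo "proxy:     $OPENAI_BASE_URL"
```

With `OPENAI_BASE_URL` exported, OpenAI-compatible tools launched from the same shell route their requests through the proxy without code changes.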
## Client setup

For OpenAI-compatible tools, point the client at the proxy:
OPENAI_BASE_URL=http://localhost:18082
Anthropic-style and Gemini-style clients should point at the same local proxy host and keep using their usual request paths.
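As a sketch, the three client styles differ only in the path appended to the proxy host. The paths below are the providers' standard public API paths; whether this proxy accepts exactly these paths is an assumption to verify against its docs:

```python
# Build request URLs for each client style against the local proxy.
# Paths are the providers' standard ones (assumed, not confirmed by this doc).
PROXY = "http://localhost:18082"

ENDPOINTS = {
    "openai": f"{PROXY}/v1/chat/completions",
    "anthropic": f"{PROXY}/v1/messages",
    "gemini": f"{PROXY}/v1beta/models/gemini-1.5-flash:generateContent",
}

for style, url in ENDPOINTS.items():
    print(f"{style}: {url}")
```

The point is that only the host changes: each client keeps its native request shape, and the proxy is addressed in place of the provider's API host.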
## What to watch in the dashboard
- live Audit Logs
- detection outcomes
- current policies
- Dry Run behavior before enabling stronger enforcement
## Important distinction
- dashboard login uses the bootstrap tenant API key
- model requests use your app's normal provider authentication flow through the proxy
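A minimal sketch of the two-credential distinction, assuming the keys are supplied via environment variables (the variable names are illustrative, not defined by this doc):

```python
import os

# Two separate credentials with separate jobs:
# - the bootstrap tenant API key is only for logging in to the dashboard
# - the provider API key travels with model requests through the proxy
dashboard_login_key = os.environ.get("TENANT_API_KEY", "<bootstrap tenant key>")
model_request_key = os.environ.get("OPENAI_API_KEY", "<provider key>")

print("dashboard login uses:", dashboard_login_key)
print("model requests use:  ", model_request_key)
```

The proxy does not replace your provider authentication; your app keeps sending its normal provider credentials, and the tenant key never appears in model traffic.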
