fix: route LLM calls through OpenWebUI tracked proxy for analytics
Changed LLM_API_URL and LLM_FALLBACK_URL from /api/v1 to /api so requests hit OpenWebUI's /api/chat/completions (tracked) instead of /api/v1/chat/completions (passthrough with no analytics).
parent ff351b38d7
commit c6a1d32498
1 changed file with 3 additions and 2 deletions
```diff
@@ -10,10 +10,11 @@ POSTGRES_DB=chrysopedia
 REDIS_URL=redis://chrysopedia-redis:6379/0
 
 # LLM endpoint (OpenAI-compatible — OpenWebUI on FYN DGX)
-LLM_API_URL=https://chat.forgetyour.name/api/v1
+# Use /api (not /api/v1) so calls route through OpenWebUI's tracked proxy for analytics
+LLM_API_URL=https://chat.forgetyour.name/api
 LLM_API_KEY=sk-changeme
 LLM_MODEL=fyn-llm-agent-chat
-LLM_FALLBACK_URL=https://chat.forgetyour.name/api/v1
+LLM_FALLBACK_URL=https://chat.forgetyour.name/api
 LLM_FALLBACK_MODEL=fyn-llm-agent-chat
 
 # Per-stage LLM model overrides (optional — defaults to LLM_MODEL)
```
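A minimal sketch of why the base-URL change matters on the client side, assuming the client joins the configured base URL with the OpenAI-compatible `/chat/completions` path (the helper function here is hypothetical, not part of this repo):

```python
import os

# Values from the .env in this commit (the key and model are placeholders there too).
os.environ.setdefault("LLM_API_URL", "https://chat.forgetyour.name/api")
os.environ.setdefault("LLM_MODEL", "fyn-llm-agent-chat")


def chat_completions_url(base_url: str) -> str:
    """Join a configured base URL with the OpenAI-compatible chat path.

    With the new base /api this resolves to OpenWebUI's tracked proxy,
    /api/chat/completions; the old /api/v1 base would have produced
    /api/v1/chat/completions, the untracked passthrough.
    """
    return base_url.rstrip("/") + "/chat/completions"


url = chat_completions_url(os.environ["LLM_API_URL"])
print(url)  # https://chat.forgetyour.name/api/chat/completions
```

The same joining logic applies to `LLM_FALLBACK_URL`, which is why both variables had to change together.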