fix: route LLM calls through OpenWebUI tracked proxy for analytics

Changed LLM_API_URL and LLM_FALLBACK_URL from /api/v1 to /api so
requests hit OpenWebUI's /api/chat/completions (tracked) instead of
/api/v1/chat/completions (passthrough with no analytics).
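The routing difference comes down to how the base URL is joined with the chat-completions path. A minimal sketch, using a hypothetical helper (not part of the codebase) to show the two resulting endpoints:

```python
def chat_completions_url(base_url: str) -> str:
    """Join the configured LLM base URL with the chat completions path."""
    return base_url.rstrip("/") + "/chat/completions"

# After this commit: routes through OpenWebUI's tracked proxy
tracked = chat_completions_url("https://chat.forgetyour.name/api")
# → https://chat.forgetyour.name/api/chat/completions

# Before this commit: hits the untracked passthrough
passthrough = chat_completions_url("https://chat.forgetyour.name/api/v1")
# → https://chat.forgetyour.name/api/v1/chat/completions
```

Both endpoints accept the same OpenAI-compatible payload, so only the env var values need to change.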
jlightner 2026-04-03 08:27:53 +00:00
parent ff351b38d7
commit c6a1d32498

@@ -10,10 +10,11 @@ POSTGRES_DB=chrysopedia
 REDIS_URL=redis://chrysopedia-redis:6379/0
 # LLM endpoint (OpenAI-compatible — OpenWebUI on FYN DGX)
-LLM_API_URL=https://chat.forgetyour.name/api/v1
+# Use /api (not /api/v1) so calls route through OpenWebUI's tracked proxy for analytics
+LLM_API_URL=https://chat.forgetyour.name/api
 LLM_API_KEY=sk-changeme
 LLM_MODEL=fyn-llm-agent-chat
-LLM_FALLBACK_URL=https://chat.forgetyour.name/api/v1
+LLM_FALLBACK_URL=https://chat.forgetyour.name/api
 LLM_FALLBACK_MODEL=fyn-llm-agent-chat
 # Per-stage LLM model overrides (optional — defaults to LLM_MODEL)