chore(env): default MANA_LLM_URL to llm.mana.how

Same convention as STT_URL: nobody runs mana-llm in local Docker for
dev work and the shared gateway is always reachable, so the path of
least friction is to point at it by default. Devs who want a fully
offline stack can still override the var locally.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Author: Till JS
Date:   2026-04-08 16:55:01 +02:00
parent  b505024f7b
commit  68e8897c9c

@@ -169,8 +169,12 @@ OLLAMA_URL=http://localhost:11434
 # mana-llm (OpenAI-compatible gateway, port 3025 locally / llm.mana.how prod)
 # Used by server-side voice quick-add proxies (parse-task, parse-habit).
-# API key is required when pointing at the GPU LLM proxy (gpu-llm.mana.how).
-MANA_LLM_URL=http://localhost:3025
+# Defaults to the shared dev gateway because nobody runs mana-llm in
+# local Docker — same convention as STT_URL above. If you want a fully
+# offline local stack, override this to http://localhost:3025 and run
+# `docker compose up mana-llm`. API key is required when pointing at
+# the GPU LLM proxy (gpu-llm.mana.how).
+MANA_LLM_URL=https://llm.mana.how
 MANA_LLM_API_KEY=
 # ============================================
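
For context, a minimal sketch of how a server-side proxy such as parse-task
might consume these two variables, assuming a Node 18+ / TypeScript server with
global fetch. The function name, route, model alias, prompt, and response shape
are illustrative assumptions, not the app's actual code; the only facts taken
from the diff are the two env vars, the OpenAI-compatible gateway, and that the
API key is only needed when pointing at gpu-llm.mana.how.

// Hypothetical sketch — not the repo's implementation.
const MANA_LLM_URL = process.env.MANA_LLM_URL ?? "https://llm.mana.how";
const MANA_LLM_API_KEY = process.env.MANA_LLM_API_KEY ?? "";

export async function parseTask(transcript: string): Promise<string> {
  // OpenAI-compatible gateway, so the standard chat completions route is assumed.
  const res = await fetch(`${MANA_LLM_URL}/v1/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Auth header only when a key is configured (i.e. gpu-llm.mana.how);
      // the shared dev gateway works without it.
      ...(MANA_LLM_API_KEY ? { Authorization: `Bearer ${MANA_LLM_API_KEY}` } : {}),
    },
    body: JSON.stringify({
      model: "default", // assumption: the gateway maps this to its configured model
      messages: [
        { role: "system", content: "Extract a single task title from the transcript." },
        { role: "user", content: transcript },
      ],
    }),
  });
  if (!res.ok) {
    throw new Error(`mana-llm request failed: ${res.status} ${res.statusText}`);
  }
  const data = (await res.json()) as { choices: { message: { content: string } }[] };
  return data.choices[0].message.content;
}

With the new default this works against llm.mana.how out of the box; for a fully
offline stack, set MANA_LLM_URL=http://localhost:3025 in your local env and run
`docker compose up mana-llm` as noted in the comment above.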