mirror of
https://github.com/Memo-2023/mana-monorepo.git
synced 2026-05-14 20:21:09 +02:00
feat(mana/web): pass MANA_LLM_API_KEY from voice parse proxies
The /api/v1/voice/parse-task and /api/v1/voice/parse-habit endpoints forwarded transcripts to mana-llm without an X-API-Key header. This worked against the local mana-llm container (no auth) but silently fell back to the no-LLM path when pointed at gpu-llm.mana.how, which requires an API key: voice quick-add would appear to run in degraded mode forever, with no signal that auth was the cause.

Now both endpoints read MANA_LLM_API_KEY from the server-side env and attach it as X-API-Key when present, mirroring the pattern already used by /api/v1/voice/transcribe for mana-stt. When the var is empty the header is omitted, so local Docker setups without auth still work.

Plumbing: generate-env.mjs writes MANA_LLM_URL and MANA_LLM_API_KEY into apps/mana/apps/web/.env, .env.development gets the new keys with empty defaults, and ENVIRONMENT_VARIABLES.md documents the gateway and where to get a key.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
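The conditional-header pattern the message describes can be sketched as follows. `llmHeaders` and `forwardToManaLlm` are hypothetical names for illustration; the real handlers live in the mana/web voice parse proxies, and only the env var names come from the commit:

```typescript
// Sketch of the pattern described above: attach X-API-Key only when
// MANA_LLM_API_KEY is non-empty, so an unauthenticated local mana-llm
// container keeps working while gpu-llm.mana.how gets the key it requires.
function llmHeaders(apiKey: string | undefined): Record<string, string> {
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  if (apiKey) {
    // Empty string and undefined are both falsy, so the header is omitted.
    headers["X-API-Key"] = apiKey;
  }
  return headers;
}

// Hypothetical server-side forwarder, assuming Node-style env access and
// global fetch as available in a mana/web API route.
async function forwardToManaLlm(path: string, transcript: string): Promise<Response> {
  const baseUrl = process.env.MANA_LLM_URL ?? "http://localhost:3025";
  return fetch(`${baseUrl}${path}`, {
    method: "POST",
    headers: llmHeaders(process.env.MANA_LLM_API_KEY),
    body: JSON.stringify({ transcript }),
  });
}
```

Per the commit message, omitting the header (rather than sending an empty value) is what keeps keyless local Docker setups working, while supplying it unblocks the authenticated gateway instead of silently dropping to the no-LLM path.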
parent 2514831a3b
commit 029c7973ef
5 changed files with 41 additions and 2 deletions
@@ -167,6 +167,12 @@ OPENROUTER_API_KEY=sk-or-v1-5bcd6de8d88ed9b7211230892df44764b2013d57d4d3c14ec302
 # Or set to direct URL if Ollama is exposed (e.g., https://ollama.mana.how)
 OLLAMA_URL=http://localhost:11434
 
+# mana-llm (OpenAI-compatible gateway, port 3025 locally / llm.mana.how prod)
+# Used by server-side voice quick-add proxies (parse-task, parse-habit).
+# API key is required when pointing at the GPU LLM proxy (gpu-llm.mana.how).
+MANA_LLM_URL=http://localhost:3025
+MANA_LLM_API_KEY=
+
 # ============================================
 # MAERCHENZAUBER PROJECT
 # ============================================