fix(api/who): use /v1/chat/completions path for mana-llm

The who module's chat endpoint was returning 502 to the browser
because mana-api called /api/v1/chat/completions on mana-llm and
got a 404: mana-llm exposes the OpenAI-compatible /v1/chat/completions
path with no /api/ prefix.

This is the same bug the research module had until commit 63a91e36a
fixed its path. The chat module (apps/api/src/modules/chat/routes.ts)
still has the wrong path; flagged as a follow-up.

Diagnostic from inside the mana-api container:
  /v1/chat/completions       → 422 (right path, empty body)
  /api/v1/chat/completions   → 404 (wrong path)
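The path difference can be sketched as follows (the base URL is a
hypothetical placeholder; the real value of LLM_URL is not shown in
this commit):

```typescript
// Hypothetical base URL for the mana-llm service.
const LLM_URL = "http://mana-llm:8000";

// Wrong: mana-llm mounts no /api/ prefix, so this path 404s.
const wrongPath = `${LLM_URL}/api/v1/chat/completions`;

// Right: the OpenAI-compatible path. An empty POST body returns 422,
// which confirms the route exists and only the payload is rejected.
const rightPath = `${LLM_URL}/v1/chat/completions`;

console.log(wrongPath); // → http://mana-llm:8000/api/v1/chat/completions
console.log(rightPath); // → http://mana-llm:8000/v1/chat/completions
```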

mana-api log line that flagged it:
  who.llm_non_200 status:404

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Till JS 2026-04-09 15:48:09 +02:00
parent 23f13d7139
commit 51f408755c


@@ -147,7 +147,10 @@ routes.post('/chat', async (c) => {
 // LLM-gateway improvement applies here too.
 let llmRes: Response;
 try {
-llmRes = await fetch(`${LLM_URL}/api/v1/chat/completions`, {
+// mana-llm exposes /v1/chat/completions (OpenAI-compatible path,
+// no /api/ prefix). The research module had the same bug until
+// commit 63a91e36a fixed its path; this is the same correction.
+llmRes = await fetch(`${LLM_URL}/v1/chat/completions`, {
 method: 'POST',
 headers: { 'Content-Type': 'application/json' },
 body: JSON.stringify({
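The diff is truncated at the request body; for context, the payload an
OpenAI-compatible /v1/chat/completions endpoint expects looks roughly
like the sketch below (field values are placeholders, not taken from
this diff):

```typescript
// Minimal OpenAI-compatible chat payload. The model name is a
// placeholder; the actual model mana-api sends is not shown here.
const payload = {
  model: "placeholder-model",
  messages: [{ role: "user", content: "ping" }],
};

console.log(JSON.stringify(payload));
```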