fix(api): set supportsStructuredOutputs=true on mana-llm provider

generateObject() in the AI SDK falls back to a tool-call mode when the
provider doesn't advertise structured-output support, and tool calling
through Ollama isn't reliable enough for the schema-validation step to
pass. Responses were failing with 'No object generated: response did
not match schema' even though the underlying mana-llm + Ollama
roundtrip works correctly when called with response_format directly
(verified via curl).
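For reference, the direct check is a request shaped roughly like the
following (the port, model tag, and schema fields are illustrative
assumptions, not the exact payload used in the verification):

```shell
# Sketch: POST a chat completion with an OpenAI-style strict json_schema
# response_format straight to mana-llm's OpenAI-compatible endpoint.
# URL, model, and schema are placeholders for illustration only.
curl -s http://localhost:8000/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "ollama/gemma3:4b",
    "messages": [{"role": "user", "content": "Analysiere: ein Apfel"}],
    "response_format": {
      "type": "json_schema",
      "json_schema": {
        "name": "meal_analysis",
        "strict": true,
        "schema": {
          "type": "object",
          "properties": {
            "dish": {"type": "string"},
            "calories": {"type": "number"}
          },
          "required": ["dish", "calories"]
        }
      }
    }
  }'
```

When this roundtrip succeeds while generateObject() fails, the problem is the SDK's mode selection rather than the model or the proxy.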

Set supportsStructuredOutputs: true on the createOpenAICompatible
factory so the AI SDK uses the response_format json_schema mode.
mana-llm already routes that to Ollama's native format field thanks to
the companion fix in services/mana-llm/src/providers/ollama.py;
verified end-to-end with the MealAnalysisSchema and Gemma 3 4B.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Till JS 2026-04-09 19:44:13 +02:00
parent f3effe9390
commit 55bf493f44
2 changed files with 10 additions and 0 deletions

@@ -45,6 +45,14 @@ const llm = createOpenAICompatible({
   // src/main.py:125). The AI SDK's openai-compatible adapter appends
   // /chat/completions to baseURL, so baseURL ends in /v1.
   baseURL: `${LLM_URL}/v1`,
+  // Tell the AI SDK that mana-llm honours OpenAI-style strict
+  // json_schema response_format. Without this, generateObject() falls
+  // back to a tool-call mode that Ollama-backed models don't support
+  // reliably and the response fails to validate against the Zod schema.
+  // mana-llm's Ollama provider translates response_format → Ollama's
+  // native `format` field (services/mana-llm/src/providers/ollama.py)
+  // so this is honoured end-to-end.
+  supportsStructuredOutputs: true,
 });
 const ANALYSIS_PROMPT = `Du bist ein Ernährungsexperte. Analysiere die Mahlzeit und gib strukturierte Nährwertdaten zurück. Schätze realistische Portionsgrößen und Kalorien. Antworte auf Deutsch.`;

@@ -28,6 +28,8 @@ const VISION_MODEL = process.env.VISION_MODEL || 'ollama/gemma3:4b';
 const llm = createOpenAICompatible({
   name: 'mana-llm',
   baseURL: `${LLM_URL}/v1`,
+  // See nutriphi/routes.ts for the rationale on this flag.
+  supportsStructuredOutputs: true,
 });
 const IDENTIFICATION_PROMPT = `Du bist ein Pflanzenexperte. Analysiere das Pflanzenfoto und liefere eine strukturierte Identifikation mit lateinischem Namen, deutschen Trivialnamen, Pflegehinweisen und einer Gesundheitseinschätzung. Antworte auf Deutsch.`;