managarten/apps/api/src
Till JS 55bf493f44 fix(api): set supportsStructuredOutputs=true on mana-llm provider
generateObject() in the AI SDK falls back to a tool-call mode when the
provider doesn't advertise structured-output support, and tool calling
through Ollama isn't reliable enough for the schema-validation step to
pass. Requests were failing with 'No object generated: response did
not match schema' even though the underlying mana-llm + Ollama
roundtrip works correctly when called with response_format directly
(verified via curl).
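The direct request that succeeds can be sketched as a plain payload. The base URL, model tag, and schema fields below are illustrative assumptions; only the response_format shape mirrors the OpenAI-compatible structured-output mode the curl check exercised:

```typescript
// JSON Schema stand-in for the real MealAnalysisSchema (fields assumed).
const mealAnalysisJsonSchema = {
  type: "object",
  properties: {
    meal: { type: "string" },
    calories: { type: "number" },
  },
  required: ["meal", "calories"],
} as const;

const body = {
  model: "gemma3:4b", // assumed Ollama model tag for Gemma 3 4B
  messages: [{ role: "user", content: "Analyze: oatmeal with berries" }],
  // OpenAI-compatible structured-output mode; mana-llm maps this to
  // Ollama's native `format` field.
  response_format: {
    type: "json_schema",
    json_schema: {
      name: "MealAnalysisSchema",
      strict: true,
      schema: mealAnalysisJsonSchema,
    },
  },
};

// A direct POST against the OpenAI-compatible route (base URL assumed):
// await fetch("http://localhost:8080/v1/chat/completions", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(body),
// });
console.log(body.response_format.type); // "json_schema"
```

When the provider does not advertise structured-output support, the AI SDK instead injects a synthetic tool definition and asks the model to "call" it, which is the path that was failing against Ollama.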

Set supportsStructuredOutputs: true on the createOpenAICompatible
factory so the AI SDK uses response_format json_schema mode. mana-llm
already routes that to Ollama's native format field thanks to the
companion fix in services/mana-llm/src/providers/ollama.py; verified
end-to-end with the MealAnalysisSchema and Gemma 3 4B.
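The change amounts to one flag on the provider factory. A minimal sketch, assuming a local base URL and model id (the real values live in the repo's config):

```typescript
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";

// Hypothetical name/baseURL for illustration.
const manaLlm = createOpenAICompatible({
  name: "mana-llm",
  baseURL: "http://localhost:8080/v1",
  // Advertise structured-output support so generateObject() sends
  // response_format json_schema instead of falling back to tool calls.
  supportsStructuredOutputs: true,
});
```

With this in place, a call like generateObject({ model: manaLlm("gemma3:4b"), schema }) lets the AI SDK validate the model's JSON against the Zod schema instead of round-tripping through an unreliable tool call.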

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-09 19:44:13 +02:00
lib fix(research): use /v1/chat/completions for mana-llm (not /api/v1/) 2026-04-08 22:37:07 +02:00
modules fix(api): set supportsStructuredOutputs=true on mana-llm provider 2026-04-09 19:44:13 +02:00
index.ts feat(api): who module — LLM character-guessing endpoint cluster 2026-04-09 13:09:46 +02:00