managarten/apps/api
Till JS 55bf493f44 fix(api): set supportsStructuredOutputs=true on mana-llm provider
generateObject() in the AI SDK falls back to a tool-call mode when the
provider doesn't advertise structured-output support, and tool calling
through Ollama isn't reliable enough for the schema-validation step to
pass. Responses were failing with 'No object generated: response did
not match schema' even though the underlying mana-llm + Ollama
roundtrip works correctly when called with response_format directly
(verified via curl).
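The direct verification described above amounts to POSTing a request with
response_format set to json_schema mode. A minimal sketch of such a request
body follows; the endpoint URL, model tag, and schema fields are illustrative
assumptions, not the actual MealAnalysisSchema:

```typescript
// Hypothetical request body for a direct OpenAI-compatible chat call.
// Model tag and schema fields are placeholders for illustration only.
const body = {
  model: "gemma3:4b",
  messages: [{ role: "user", content: "Analyze this meal and reply as JSON." }],
  // json_schema mode: the server enforces the schema on the response,
  // instead of relying on tool calling to produce structured output.
  response_format: {
    type: "json_schema",
    json_schema: {
      name: "MealAnalysis",
      schema: {
        type: "object",
        properties: { calories: { type: "number" }, protein_g: { type: "number" } },
        required: ["calories"],
      },
    },
  },
};

// A curl-style check would send this to mana-llm's chat endpoint, e.g.:
// fetch("http://localhost:8000/v1/chat/completions", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(body),
// });
console.log(body.response_format.type);
```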

Set supportsStructuredOutputs: true on the createOpenAICompatible
factory so the AI SDK uses the response_format json_schema mode. mana-llm
already routes that to Ollama's native format field thanks to the
companion fix in services/mana-llm/src/providers/ollama.py; verified
end-to-end with the MealAnalysisSchema and Gemma 3 4B.
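The fix itself is a one-flag change on the provider factory. A sketch of the
resulting setup, assuming a local mana-llm base URL and provider name (only
supportsStructuredOutputs: true is confirmed by this commit):

```typescript
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";

// Provider for the mana-llm service; name and baseURL are assumptions.
export const manaLlm = createOpenAICompatible({
  name: "mana-llm",
  baseURL: process.env.MANA_LLM_URL ?? "http://localhost:8000/v1",
  // Without this flag the AI SDK assumes the provider cannot do structured
  // outputs, so generateObject() falls back to tool-call mode, which is
  // unreliable through Ollama. With it, the SDK sends response_format
  // json_schema, which mana-llm maps to Ollama's native format field.
  supportsStructuredOutputs: true,
});
```

generateObject() calls against manaLlm then validate the model's JSON against
the Zod schema (e.g. MealAnalysisSchema from @mana/shared-types) directly.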

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-09 19:44:13 +02:00
drizzle/research feat(questions): deep-research module — mana-search + mana-llm pipeline 2026-04-08 22:15:35 +02:00
src fix(api): set supportsStructuredOutputs=true on mana-llm provider 2026-04-09 19:44:13 +02:00
Dockerfile fix(api/Dockerfile): copy @mana/shared-types into the build context 2026-04-09 17:25:23 +02:00
drizzle.config.ts feat(questions): deep-research module — mana-search + mana-llm pipeline 2026-04-08 22:15:35 +02:00
package.json feat(shared-types): add Zod schemas for AI structured outputs 2026-04-09 16:59:28 +02:00
tsconfig.json feat(api): create unified API server with first 3 modules 2026-04-02 21:12:15 +02:00