Mirror of https://github.com/Memo-2023/mana-monorepo.git — synced 2026-05-15 01:41:08 +02:00
generateObject() in the AI SDK falls back to a tool-call mode when the provider doesn't advertise structured-output support, and tool calling through Ollama isn't reliable enough for the schema-validation step to pass. The response was failing with 'No object generated: response did not match schema' even though the underlying mana-llm + Ollama round trip works correctly when called with response_format directly (verified via curl).

Fix: set supportsStructuredOutputs: true on the createOpenAICompatible factory so the AI SDK uses response_format json_schema mode. mana-llm already routes that to Ollama's native format field, thanks to the companion fix in services/mana-llm/src/providers/ollama.py. Verified end-to-end with MealAnalysisSchema and Gemma 3 4B.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
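For context on the routing this commit relies on: with `supportsStructuredOutputs: true` set on the `createOpenAICompatible` factory, the AI SDK emits `response_format: { type: "json_schema", … }` on the OpenAI-compatible request, and the Ollama provider in mana-llm forwards that schema to Ollama's native `format` field instead of emulating structure via tool calls. A minimal sketch of that translation step — the type shapes and the function name are illustrative, not the actual mana-llm code:

```typescript
// Sketch: map an OpenAI-compatible chat request that uses
// response_format json_schema onto an Ollama /api/chat request.
// Shapes and names are illustrative, not mana-llm's real API.

type OpenAICompatRequest = {
  model: string;
  messages: { role: string; content: string }[];
  response_format?: {
    type: "json_schema";
    json_schema: { name: string; schema: Record<string, unknown> };
  };
};

type OllamaChatRequest = {
  model: string;
  messages: { role: string; content: string }[];
  // Ollama accepts a JSON Schema object (or the string "json") here.
  format?: Record<string, unknown> | "json";
};

function toOllamaRequest(req: OpenAICompatRequest): OllamaChatRequest {
  const out: OllamaChatRequest = { model: req.model, messages: req.messages };
  if (req.response_format?.type === "json_schema") {
    // Route the schema to Ollama's native structured-output field
    // rather than relying on tool-calling emulation.
    out.format = req.response_format.json_schema.schema;
  }
  return out;
}
```

With this mapping in place, the curl-verified behavior (Ollama constraining its output to the schema) is what the AI SDK's json_schema mode exercises, so generateObject's schema validation passes.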