managarten/services/mana-llm/src/models
Till JS 5520f1385e fix(mana-llm): add response_format to ChatCompletionRequest model
The first iteration of the Ollama response_format passthrough crashed
with "'ChatCompletionRequest' object has no attribute 'response_format'"
because the Pydantic request model didn't declare the field at all:
incoming response_format from OpenAI-compatible clients was silently
dropped at the parsing layer before the provider could see it.

Fix: declare a typed ResponseFormat sub-model with the two OpenAI shapes
('json_object' and 'json_schema'), add it as an optional field on
ChatCompletionRequest, and let the Ollama provider read it directly
without defensive getattr fallbacks.
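A minimal sketch of the shape this fix describes, assuming Pydantic v2; the
field names follow the OpenAI request format ('json_object' and
'json_schema'), but the exact sub-model layout here is illustrative, not
the repository's actual code:

```python
from typing import Any, Dict, List, Literal, Optional

from pydantic import BaseModel


class ResponseFormat(BaseModel):
    # The two OpenAI-compatible shapes: {"type": "json_object"} or
    # {"type": "json_schema", "json_schema": {...}}.
    type: Literal["json_object", "json_schema"]
    # Only present for the "json_schema" variant.
    json_schema: Optional[Dict[str, Any]] = None


class ChatCompletionRequest(BaseModel):
    model: str
    messages: List[Dict[str, Any]]  # simplified for the sketch
    # Previously missing: without this declaration, Pydantic discarded
    # an incoming response_format before the provider could read it.
    response_format: Optional[ResponseFormat] = None
```

With the field declared, the provider can access `request.response_format`
directly; when the client omits it, the value is simply `None`, so no
`getattr` fallback is needed.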

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-09 18:50:54 +02:00
__init__.py chore: update dependencies and mana-llm improvements 2026-01-30 17:50:58 +01:00
requests.py fix(mana-llm): add response_format to ChatCompletionRequest model 2026-04-09 18:50:54 +02:00
responses.py feat(mana-llm): add central LLM abstraction service 2026-01-29 22:01:00 +01:00