managarten/services/mana-llm
commit e757470cb0 (Till JS)
feat(mana-llm): add OpenAI-style tools + tool_calls passthrough
Extends the chat-completions surface so callers can ask any provider
to call named functions and get structured tool_calls back. Wired
through all three provider adapters so the planner and companion can
switch off the fragile JSON-parsing pathway.

- Request: tools[], tool_choice, assistant tool_calls, tool-role
  messages with tool_call_id.
- Response: MessageResponse.tool_calls, Choice.finish_reason adds
  "tool_calls", DeltaContent streams tool_calls.
- Google provider: Tool(function_declarations=...) build, result
  normalised (args dict → JSON string), function_response parts on
  a user turn for tool-role messages.
- OpenAI-compat: 1:1 passthrough of the OpenAI spec.
- Ollama: /api/chat passthrough; model-level capability check via a
  TOOL_CAPABLE_OLLAMA_PATTERNS whitelist (llama3.1+, qwen2.5+,
  mistral, command-r, …) — unsupported models rejected rather than
  silently falling back to prose.
- Router: model_supports_tools() check upfront for both streaming
  and non-streaming paths; ProviderCapabilityError bubbles as 400.

No silent downgrade. Missing tool support = explicit error.
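End to end, a caller's use of the new surface might look like the sketch below. The request fields (`tools`, `tool_choice`) and response fields (`finish_reason == "tool_calls"`, `message.tool_calls`, tool-role follow-up with `tool_call_id`) are the ones this commit adds; the model name, tool, and weather result are placeholders:

```python
import json

# Request body exercising the new fields (model/tool names are placeholders).
request_body = {
    "model": "llama3.1:8b",
    "messages": [{"role": "user", "content": "Weather in Berlin?"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
    "tool_choice": "auto",
}

def run_tool_call(choice: dict) -> dict:
    """When the model stopped to call a tool, execute it and build the
    tool-role message the caller sends back on the next turn."""
    assert choice["finish_reason"] == "tool_calls"
    call = choice["message"]["tool_calls"][0]
    args = json.loads(call["function"]["arguments"])
    result = f"Sunny in {args['city']}"  # stand-in for the real tool
    return {"role": "tool", "tool_call_id": call["id"], "content": result}
```

The tool-role message then goes back into `messages`, alongside the assistant turn that carries the original tool_calls, and the loop repeats until `finish_reason` is a plain stop.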

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-20 15:22:48 +02:00
src feat(mana-llm): add OpenAI-style tools + tool_calls passthrough 2026-04-20 15:22:48 +02:00
tests feat(mana-llm): add central LLM abstraction service 2026-01-29 22:01:00 +01:00
.env.example chore(ai-services): adopt Windows GPU as source of truth for llm/stt/tts 2026-04-08 12:46:03 +02:00
.gitignore feat(mana-llm): add central LLM abstraction service 2026-01-29 22:01:00 +01:00
CLAUDE.md chore(matrix): final scrub of stale matrix references 2026-04-08 16:47:54 +02:00
docker-compose.dev.yml feat(mana-llm): add central LLM abstraction service 2026-01-29 22:01:00 +01:00
docker-compose.yml feat(mana-llm): add central LLM abstraction service 2026-01-29 22:01:00 +01:00
Dockerfile feat(mana-llm): add central LLM abstraction service 2026-01-29 22:01:00 +01:00
pyproject.toml feat(mana-llm): add Google Gemini fallback provider with auto-routing 2026-03-23 22:44:09 +01:00
requirements.txt fix(mana-llm): add google-genai to requirements.txt for Docker builds 2026-04-16 12:40:30 +02:00
service.pyw chore(ai-services): adopt Windows GPU as source of truth for llm/stt/tts 2026-04-08 12:46:03 +02:00
start.sh feat(llm-playground): add model comparison feature 2026-01-31 23:30:16 +01:00