managarten/apps/chat
Till-JS 6f51f1a24c feat(chat-backend): integrate Ollama for local LLM inference
- Add OllamaService for local model inference via Ollama API
- Update ChatService to route requests based on model provider
- Support both 'ollama' (local) and 'openrouter' (cloud) providers
- Add Gemma 3 4B as default model (free, runs on Mac Mini)
- Add SQL migration script for existing databases
- Update CLAUDE.md with Ollama configuration docs
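The provider-based routing described above can be sketched roughly as follows. This is a hypothetical illustration, not the actual implementation: the names `ChatModel`, `CompletionService`, and the constructor shape of `ChatService` are assumptions; only the two provider values (`'ollama'` and `'openrouter'`) come from the commit itself.

```typescript
// Hypothetical sketch of provider-based routing in ChatService.
// Interface and class names are assumptions, not from the real codebase.

interface ChatModel {
  id: string;
  provider: "ollama" | "openrouter";
}

interface CompletionService {
  complete(model: string, prompt: string): Promise<string>;
}

class ChatService {
  constructor(
    private readonly ollama: CompletionService,
    private readonly openrouter: CompletionService,
  ) {}

  // Route to the local (Ollama) or cloud (OpenRouter) backend
  // based on the model's provider field.
  complete(model: ChatModel, prompt: string): Promise<string> {
    const backend =
      model.provider === "ollama" ? this.ollama : this.openrouter;
    return backend.complete(model.id, prompt);
  }
}
```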

Environment variables:
- OLLAMA_URL: Ollama server URL (default: http://localhost:11434)
- OLLAMA_TIMEOUT: Request timeout in ms (default: 120000)
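A minimal sketch of how these variables could be consumed, assuming a Node.js runtime with a global `fetch`. The `generate` helper is hypothetical; the `/api/generate` endpoint and its `{ model, prompt, stream }` request body follow the public Ollama REST API, but the service's real code is not shown in this commit.

```typescript
// Hypothetical OllamaService internals; env-var names and defaults
// are taken from the commit message above.

const OLLAMA_URL = process.env.OLLAMA_URL ?? "http://localhost:11434";
const OLLAMA_TIMEOUT = Number(process.env.OLLAMA_TIMEOUT ?? 120_000);

async function generate(model: string, prompt: string): Promise<string> {
  // AbortSignal.timeout enforces the configured request timeout in ms.
  const res = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: false }),
    signal: AbortSignal.timeout(OLLAMA_TIMEOUT),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = (await res.json()) as { response: string };
  return data.response;
}
```

With no environment overrides this targets a local Ollama server on port 11434 and times out after 120 seconds, matching the documented defaults.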

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-26 16:03:03 +01:00
File | Last commit | Date
apps | feat(chat-backend): integrate Ollama for local LLM inference | 2026-01-26 16:03:03 +01:00
packages/chat-types | fix: resolve all svelte-check a11y warnings across web apps | 2025-12-15 19:09:01 +01:00
CLAUDE.md | feat(chat-backend): integrate Ollama for local LLM inference | 2026-01-26 16:03:03 +01:00
INTEGRATION_COMPLETE.md | style: auto-format codebase with Prettier | 2025-11-27 18:33:16 +01:00
MANA_CORE_AUTH_INTEGRATION.md | style: auto-format codebase with Prettier | 2025-11-27 18:33:16 +01:00
package.json | 🔧 chore: enforce monorepo best practices with automated validation | 2025-12-25 17:57:00 +01:00
TESTING_GUIDE.md | 🔒 security(auth): migrate to EdDSA JWT and add automated monitoring | 2025-12-18 21:42:47 +01:00