Migrate all LLM consumers from direct Ollama calls to the centralized
mana-llm service with an OpenAI-compatible API.
Migrated services:
- matrix-ollama-bot
- telegram-ollama-bot
- chat-backend
- telegram-project-doc-bot
New env vars: MANA_LLM_URL, LLM_MODEL, LLM_TIMEOUT
These replace: OLLAMA_URL, OLLAMA_MODEL, OLLAMA_TIMEOUT
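For orientation, a minimal sketch of what a migrated consumer might look like, assuming the official openai npm client; the /v1 base path, the MANA_LLM_API_KEY variable, and the fallback model name are illustrative assumptions, not part of this change:

```typescript
import OpenAI from "openai";

// Sketch of a migrated consumer. MANA_LLM_URL is assumed to point at the
// gateway's OpenAI-compatible base path; the API key handling is hypothetical,
// since many self-hosted gateways ignore it.
const client = new OpenAI({
  baseURL: process.env.MANA_LLM_URL,                 // e.g. http://mana-llm:8000/v1
  apiKey: process.env.MANA_LLM_API_KEY ?? "unused",  // assumption: key not enforced
  timeout: Number(process.env.LLM_TIMEOUT ?? 60_000), // milliseconds
});

const completion = await client.chat.completions.create({
  model: process.env.LLM_MODEL ?? "llama3",          // fallback is illustrative
  messages: [{ role: "user", content: "ping" }],
});
console.log(completion.choices[0].message.content);
```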
- Add keyword detection for German/English commands (hilfe, modelle, status); see the sketch after this list
- Send welcome message when users join the room
- Send bot introduction when invited to new rooms
- Add !pin command to pin help message
- Auto-pin help when joining new rooms
- Update help text with simpler command overview
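A minimal sketch of the keyword routing described above; the handler names and the exact trigger list are assumptions based on the commands mentioned:

```typescript
// Map localized trigger words to a canonical command name.
// The English aliases are an assumption; the commit only names the triggers.
const KEYWORD_COMMANDS: Record<string, string> = {
  hilfe: "help",     // German
  help: "help",      // English
  modelle: "models",
  models: "models",
  status: "status",
};

function detectCommand(body: string): string | undefined {
  // Normalize casing and surrounding whitespace before lookup.
  return KEYWORD_COMMANDS[body.trim().toLowerCase()];
}

// e.g. detectCommand("Hilfe") === "help"
```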
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add !vision command to analyze images with vision models
- Add !vision:all command to compare all vision models
- Filter out specialized models (deepseek-r1) from !all comparison
- Add chatWithImage method to OllamaService for vision requests (see the sketch after this list)
- Switch Dockerfile from pnpm to npm for better compatibility
- Add .dockerignore and tsconfig.build.json
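A sketch of what a chatWithImage helper might look like; the request fields follow Ollama's documented /api/chat contract (base64 images attached to a message), but this exact method body is an illustration, not the committed code. It assumes Node 18+ for the global fetch:

```typescript
// Illustrative vision request against Ollama's chat endpoint.
async function chatWithImage(
  baseUrl: string,
  model: string,
  prompt: string,
  image: Buffer,
): Promise<string> {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      stream: false, // return a single response object instead of a stream
      messages: [
        // Ollama accepts base64-encoded images alongside the text prompt.
        { role: "user", content: prompt, images: [image.toString("base64")] },
      ],
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.message.content;
}
```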
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add !all [question] command to query all models and compare responses (sketch after this list)
- Show response times for each model
- Update help text with new command and rename to Mana Chat
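A sketch of the fan-out behind !all; `chat` is an assumed helper wrapping the per-model LLM call:

```typescript
// Query every model in parallel and record per-model latency.
async function queryAllModels(
  models: string[],
  question: string,
  chat: (model: string, prompt: string) => Promise<string>,
) {
  return Promise.all(
    models.map(async (model) => {
      const start = Date.now();
      const answer = await chat(model, question);
      return { model, answer, ms: Date.now() - start };
    }),
  );
}
```

Running the queries with Promise.all keeps the wall-clock time of a comparison close to the slowest model rather than the sum of all of them.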
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Import LogLevel separately instead of LogService.LogLevel
- Change sendTyping to setTyping
- Use any type for event handler to avoid generic type issues
- Fix Buffer to Uint8Array conversion for OpenAI File API
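A sketch of the Buffer-to-Uint8Array fix; toFile is the documented helper in the openai package, while the transcription call is only an example upload target, not necessarily what this bot does:

```typescript
import OpenAI, { toFile } from "openai";

const client = new OpenAI();

async function transcribe(buf: Buffer, filename: string) {
  // Node's Buffer is a Uint8Array subclass at runtime, but copying into a
  // plain Uint8Array satisfies the stricter typings of the File API.
  const file = await toFile(new Uint8Array(buf), filename);
  return client.audio.transcriptions.create({ file, model: "whisper-1" });
}
```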
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>