Migrate matrix-project-doc-bot from raw fetch to @manacore/shared-llm
and remove the unused openai npm package. The bot already used
mana-llm and mana-stt rather than OpenAI directly, but the code still
contained raw fetch calls, and the unused openai package remained installed.
Changes:
- generation.service.ts: raw fetch → llm.chat() via LlmClientService
- app.module.ts: add LlmModule.forRootAsync()
- Remove openai dependency (was unused in code)
- Update CLAUDE.md: document actual AI stack (mana-llm + mana-stt)
- Update TECH_STACK_INDEPENDENCE.md: mark Prio 1-3 as completed
  - Prio 1: Picture App → mana-image-gen ✅
  - Prio 2: Project Doc Bot → Ollama + mana-stt ✅
  - Prio 3: All LLM calls via mana-llm ✅
- Self-hosted percentage: 75% → ~80%
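The generation.service.ts change swaps a hand-rolled HTTP call for the shared
client. A minimal before/after sketch of that shape — the LlmClientService
interface, the endpoint path, and the response fields below are assumptions
(only llm.chat() and the service name are confirmed by this change); the stub
stands in for the real injected client:

```typescript
// Hypothetical shape of the shared client; the real @manacore/shared-llm
// interface may differ — only the chat() entry point is named in this commit.
interface ChatMessage { role: "system" | "user" | "assistant"; content: string; }
interface LlmClientService { chat(messages: ChatMessage[]): Promise<string>; }

// Before: generation.service.ts talked to the backend with a raw fetch
// (endpoint path and response shape here are illustrative, not the real ones).
async function generateDocRaw(baseUrl: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages: [{ role: "user", content: prompt }] }),
  });
  if (!res.ok) throw new Error(`LLM request failed: ${res.status}`);
  const data = await res.json();
  return data.message?.content ?? "";
}

// After: the injected client owns transport and error handling, so the
// service only builds messages and awaits the reply.
async function generateDoc(llm: LlmClientService, prompt: string): Promise<string> {
  return llm.chat([{ role: "user", content: prompt }]);
}

// Stubbed usage, since the real service is wired up via Nest DI:
const stub: LlmClientService = {
  chat: async (msgs) => `echo:${msgs[msgs.length - 1].content}`,
};
generateDoc(stub, "hello").then((out) => console.log(out)); // prints "echo:hello"
```

The app.module.ts side pairs with this: LlmModule.forRootAsync() presumably
follows the standard NestJS async dynamic-module pattern, resolving the
client's configuration (e.g. the Ollama base URL) from ConfigService at
startup so generation.service.ts can inject LlmClientService directly.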
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Comprehensive analysis of self-hosted vs cloud dependencies, with a
prioritized roadmap to reach ~90% self-hosting. Key findings: mana-image-gen
can replace Replicate, all LLM calls should route through mana-llm,
and the backup strategy needs strengthening.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>