Commit graph

3 commits

Author SHA1 Message Date
Till JS
919cb4bf35 fix(local-llm): wrap @mlc-ai/web-llm in dynamic import for Docker builds
Move hasModelInCache into the local-llm package behind a dynamic import
wrapper so the browser-only dependency doesn't break server-side builds.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-02 12:22:20 +02:00
Till JS
3bef29b9c8 feat(local-llm): add generate utilities and reactive Svelte status
Add generate.ts with streaming chat completions, JSON extraction, and
text classification helpers. Add status.svelte.ts with a Svelte 5 runes
reactive wrapper for LLM engine state.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-02 11:57:50 +02:00
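Of the generate.ts helpers this commit lists, the JSON-extraction one is the easiest to sketch in isolation: small local models often wrap their JSON answer in prose or code fences, so the helper scans for the first balanced `{...}` span and parses it. A minimal sketch; the function name, generic, and null-on-failure contract are assumptions, and real string-escaping inside the JSON is not handled here.

```typescript
// Hedged sketch of a JSON-extraction helper for LLM output: find the
// first balanced {...} block in the text and JSON.parse it.
export function extractJson<T = unknown>(text: string): T | null {
  const start = text.indexOf("{");
  if (start === -1) return null;

  let depth = 0;
  for (let i = start; i < text.length; i++) {
    if (text[i] === "{") depth++;
    else if (text[i] === "}" && --depth === 0) {
      // First balanced block found; parse it, tolerating bad JSON.
      try {
        return JSON.parse(text.slice(start, i + 1)) as T;
      } catch {
        return null;
      }
    }
  }
  return null; // braces never balanced
}
```

The streaming side of generate.ts presumably iterates the OpenAI-style `engine.chat.completions.create({ stream: true, ... })` async iterator that WebLLM exposes, with the classification helper layering a fixed label prompt plus this extractor on top.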
Till JS
ef538245d1 feat(local-llm): add client-side LLM inference package with WebLLM
New shared package for browser-based LLM inference using Qwen 2.5 1.5B
via WebLLM. Includes Svelte 5 reactive stores, engine management, and
type definitions for local AI features without server roundtrips.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-02 01:53:54 +02:00
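The engine state that status.svelte.ts makes reactive can be sketched framework-free: a plain status record plus a pure transition on download-progress reports. All names below are assumptions for illustration; in the real file the same fields would live in a Svelte 5 `$state` rune so components re-render as the Qwen model loads.

```typescript
// Hedged sketch of LLM engine status tracking. In status.svelte.ts the
// state object would be declared with $state(...) for reactivity.
export type LlmStatus = "idle" | "loading" | "ready" | "error";

export interface LlmState {
  status: LlmStatus;
  progress: number; // 0..1 model download/compile progress
  error?: string;
}

// Pure transition fed from WebLLM's initProgressCallback reports;
// clamps progress and flips to "ready" once loading completes.
export function reduceStatus(
  state: LlmState,
  report: { progress: number }
): LlmState {
  const progress = Math.min(1, Math.max(0, report.progress));
  return {
    ...state,
    progress,
    status: progress >= 1 ? "ready" : "loading",
  };
}
```

Keeping the transition pure makes it testable outside the browser, while the rune wrapper stays a thin reactive shell around it.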