fix(llm): user-friendly messages + settings link for all LLM errors

Move getUserMessage() to the base LlmError class so every error type
gets a German explanation with a clickable settings deep-link:

- TierTooLowError: "Kein KI-Modell aktiviert. Mindestens X benötigt." ("No AI model enabled. At least X required.")
- ProviderBlockedError: "… hat die Anfrage blockiert (Inhaltsfilter)." ("… blocked the request (content filter).")
- BackendUnreachableError: "… ist nicht erreichbar." ("… is unreachable.")
- EdgeLoadFailedError: "Browser-Modell konnte nicht geladen werden." ("Browser model could not be loaded.")
- Generic fallback: also includes the settings link now

The companion engine now catches LlmError (base class) instead of
only NoTierAvailableError, covering all failure modes.
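The hierarchy this commit describes could be sketched as follows. Only the class names and the settings deep-link appear in the commit; the constructor parameters, exact message wording, and the `toUserMessage` helper are illustrative assumptions, not the actual `@mana/shared-llm` API:

```typescript
// Hypothetical sketch of the error hierarchy; wording and fields are assumed.
const SETTINGS_LINK = "[KI-Einstellungen öffnen](/?app=settings#ai-options)";

class LlmError extends Error {
  // Generic fallback: every LLM error now carries the settings link.
  getUserMessage(): string {
    return `LLM-Fehler: ${this.message}\n\n${SETTINGS_LINK}`;
  }
}

class TierTooLowError extends LlmError {
  constructor(readonly requiredTier: string) {
    super(`no model at tier ${requiredTier} or above is enabled`);
  }
  override getUserMessage(): string {
    return `Kein KI-Modell aktiviert. Mindestens ${this.requiredTier} benötigt.\n\n${SETTINGS_LINK}`;
  }
}

class BackendUnreachableError extends LlmError {
  override getUserMessage(): string {
    return `Das Backend ist nicht erreichbar.\n\n${SETTINGS_LINK}`;
  }
}

// One catch on the base class covers every failure mode (hypothetical helper):
function toUserMessage(err: unknown): string {
  if (err instanceof LlmError) return err.getUserMessage();
  const msg = err instanceof Error ? err.message : String(err);
  return `LLM nicht verfügbar: ${msg}\n\n${SETTINGS_LINK}`;
}
```

Because subclasses override a single base method, new error types automatically get both a user-facing explanation and the settings link without touching the call site.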

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Author: Till JS
Date: 2026-04-17 15:13:48 +02:00
Parent: fa31fa0caf
Commit: 1cfd05939e
2 changed files with 28 additions and 6 deletions


@@ -11,7 +11,7 @@
  * routes through text-completion).
  */
-import { llmOrchestrator, NoTierAvailableError } from '@mana/shared-llm';
+import { llmOrchestrator, LlmError } from '@mana/shared-llm';
 import { isLocalLlmSupported, getLocalLlmStatus, loadLocalLlm } from '@mana/local-llm';
 import { companionChatTask } from '$lib/llm-tasks/companion-chat';
 import { generateContextDocument } from '$lib/data/projections/context-document';
@@ -54,11 +54,11 @@ async function callLlm(messages: LlmMessage[], onToken?: (token: string) => void
     });
     return result.value.content;
   } catch (err) {
-    if (err instanceof NoTierAvailableError) {
+    if (err instanceof LlmError) {
       return err.getUserMessage();
     }
     const msg = err instanceof Error ? err.message : String(err);
-    return `LLM nicht verfügbar: ${msg}`;
+    return `LLM nicht verfügbar: ${msg}\n\n[KI-Einstellungen öffnen](/?app=settings#ai-options)`;
   }
 }