mirror of
https://github.com/Memo-2023/mana-monorepo.git
synced 2026-05-14 20:01:09 +02:00
Phase 1-3 of BYOK support. Introduces a 5th LLM tier 'byok' that routes to user-provided API keys via direct browser fetches.

shared-llm additions:
- LlmTier extended with 'byok' (rank 3, between mana-server and cloud)
- ByokBackend: LlmBackend implementation that delegates key lookup to an app-provided resolver callback, then dispatches to the right provider adapter
- 4 provider adapters:
  - OpenAI (gpt-5, gpt-4o, o1 family)
  - Anthropic (Claude Opus/Sonnet/Haiku 4.6) with CORS header
  - Gemini (2.5 Pro/Flash) — REST API with different message format
  - Mistral — OpenAI-compatible, reuses shared openai-compat adapter
- Pricing table for 20+ models with USD per 1M tokens
- estimateCost() + formatCost() helpers

Keys stay device-local (IndexedDB in next phase). Browser-direct fetches mean keys never touch Mana's server.

Updates two existing tier maps (memoro DetailView, SourceBadge) to include the new tier.

Planning doc at docs/architecture/BYOK_PLAN.md.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
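The pricing helpers mentioned in the commit message could look roughly like this. This is a minimal sketch, not the actual shared-llm code: the `PRICING` table shape, the field names, and the rates shown are all assumptions for illustration.

```typescript
// Hypothetical shape of the per-model pricing table (USD per 1M tokens).
// The rates below are illustrative placeholders, not real prices.
interface ModelPricing {
  inputPerMTok: number;
  outputPerMTok: number;
}

const PRICING: Record<string, ModelPricing> = {
  'mistral-small-latest': { inputPerMTok: 0.1, outputPerMTok: 0.3 },
  'mistral-large-latest': { inputPerMTok: 2.0, outputPerMTok: 6.0 },
};

// Estimated cost in USD for one call; null when the model is not in the table.
function estimateCost(
  model: string,
  inputTokens: number,
  outputTokens: number
): number | null {
  const p = PRICING[model];
  if (!p) return null;
  return (
    (inputTokens / 1_000_000) * p.inputPerMTok +
    (outputTokens / 1_000_000) * p.outputPerMTok
  );
}

// Render a cost as a short human-readable string, with extra precision
// for sub-cent amounts.
function formatCost(usd: number | null): string {
  if (usd === null) return 'unknown';
  return usd < 0.01 ? `$${usd.toFixed(4)}` : `$${usd.toFixed(2)}`;
}

const cost = estimateCost('mistral-small-latest', 10_000, 2_000);
console.log(formatCost(cost)); // "$0.0016"
```

Keeping the table keyed by model ID lets unknown models degrade gracefully to `null` rather than reporting a wrong price.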
23 lines
624 B
TypeScript
import type { ByokProvider, ByokCallOptions } from './types';
import { callOpenAiCompat } from './openai-compat';
import type { GenerateResult } from '../../types';

export const mistralProvider: ByokProvider = {
  id: 'mistral',
  displayName: 'Mistral AI',
  defaultModel: 'mistral-small-latest',
  availableModels: [
    'mistral-large-latest',
    'mistral-small-latest',
    'mistral-medium-latest',
    'open-mistral-nemo',
    'codestral-latest',
  ],

  async call(opts: ByokCallOptions): Promise<GenerateResult> {
    return callOpenAiCompat(
      { baseUrl: 'https://api.mistral.ai/v1', providerName: 'Mistral' },
      opts
    );
  },
};
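The commit message describes ByokBackend as delegating key lookup to an app-provided resolver callback before dispatching to a provider adapter. A sketch of that wiring might look like the following; the `KeyResolver` type, the option names, and the `callByok` helper are assumptions for illustration, not the real shared-llm API.

```typescript
// Hypothetical resolver: the app supplies this, backed by device-local
// storage (per the commit message, IndexedDB in a later phase).
type KeyResolver = (providerId: string) => Promise<string | null>;

interface ByokMessage {
  role: 'user' | 'assistant';
  content: string;
}

// Dispatch a BYOK call: look up the key via the app's resolver, then hand
// off to a provider adapter. The key never leaves the device, because the
// fetch happens browser-direct rather than through Mana's server.
async function callByok(
  resolveKey: KeyResolver,
  providerId: string,
  opts: { model: string; messages: ByokMessage[] }
): Promise<string> {
  const apiKey = await resolveKey(providerId);
  if (!apiKey) {
    throw new Error(`No API key configured for provider '${providerId}'`);
  }
  // A real implementation would select the matching adapter (e.g. the
  // mistralProvider above) and forward { ...opts, apiKey } to its call();
  // this placeholder just shows the dispatch shape.
  return `dispatch to ${providerId}: model=${opts.model}, ` +
    `${opts.messages.length} message(s)`;
}
```

The resolver indirection keeps shared-llm storage-agnostic: the library never imports IndexedDB, and tests can inject an in-memory resolver.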