Phase 9k: media upload via MinIO container

Dedicated cards-minio container in docker-compose (9100/9101; the
platform MinIO stays isolated on 9000/9001). cardsadmin/cardsadmin as
the dev default, prod via env vars (CARDS_S3_*).

apps/api/src/services/storage.ts: a thin StorageService around the
minio client. ensureBucket() is idempotent (auto-creates on first
upload). removeObjectsByPrefix() implements the GDPR bucket sweep,
because the S3 API has no cascading delete.
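
The sweep itself is just list-then-batch-delete. A pure-logic sketch of
that shape, with the MinIO client calls abstracted into parameters
(names here are illustrative, not from the codebase):

```typescript
// Sketch of removeObjectsByPrefix: list every key under the prefix,
// then delete them in one batch. The two callbacks stand in for the
// MinIO client's listObjectsV2/removeObjects calls.
async function sweepPrefix(
  listKeys: (prefix: string) => AsyncIterable<string>,
  removeKeys: (keys: string[]) => Promise<void>,
  prefix: string
): Promise<number> {
  const keys: string[] = [];
  for await (const key of listKeys(prefix)) keys.push(key);
  if (keys.length > 0) await removeKeys(keys);
  return keys.length; // number of objects swept
}
```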

New table media_files in pgSchema('cards'):
  id, user_id, object_key, mime_type, original_filename, size_bytes,
  kind, created_at; no FK to cards (one file can belong to several
  cards). objectKey format <userId>/<ulid>.<ext> enables the bucket
  prefix sweep on GDPR delete. The legacy mediaRefs table stays as a
  slot.
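
The key shape carries the delete strategy; a tiny sketch (helper names
are illustrative, not from the codebase):

```typescript
// The per-user prefix turns GDPR delete into a single prefix listing.
// The trailing slash matters: without it, sweeping user "u-1" would
// also match objects belonging to "u-12".
function objectKeyFor(userId: string, ulid: string, ext: string): string {
  return `${userId}/${ulid}.${ext}`;
}
function sweepPrefixFor(userId: string): string {
  return `${userId}/`;
}
```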

New router /api/v1/media:
  POST /upload   — multipart, 25 MiB default limit, image/audio/video
                   only (415 otherwise); writes a media_files row and
                   stores the object in MinIO under <userId>/<ulid>.<ext>
  GET  /:id      — streams from MinIO with Cache-Control: private,
                   immutable. Cross-user access → 404 (not 403,
                   anti-enumeration).
  GET  /         — lists the user's own files
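
The upload gate reduces to a MIME prefix allowlist plus a byte cap; a
condensed sketch using the route's status codes (the helper name is
illustrative):

```typescript
// Mirrors the route's checks: 415 for non-media MIME types,
// 413 over the configurable byte limit, otherwise accepted (201).
const ALLOWED_PREFIXES = ['image/', 'audio/', 'video/'];

function uploadStatus(
  mime: string,
  sizeBytes: number,
  maxBytes = 25 * 1024 * 1024
): 201 | 413 | 415 {
  if (!ALLOWED_PREFIXES.some((p) => mime.startsWith(p))) return 415;
  if (sizeBytes > maxBytes) return 413;
  return 201;
}
```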

The GDPR paths (service key + /me/delete) now also clear media_files
and the MinIO bucket prefix. The storage sweep is non-fatal: the DB is
already deleted consistently first, so dead bytes are the worst
possible outcome.
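
The ordering can be sketched like this (the injected sweep callback
stands in for StorageService.removeObjectsByPrefix):

```typescript
// DB rows are deleted first (transaction, not shown); the storage
// sweep runs afterwards and is allowed to fail: a failed sweep only
// leaves orphaned bytes, never dangling DB rows.
async function sweepAfterDbDelete(
  sweep: (prefix: string) => Promise<number>,
  userId: string
): Promise<number> {
  try {
    return await sweep(`${userId}/`);
  } catch {
    return 0; // non-fatal: logged upstream, the request still succeeds
  }
}
```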

Anki import: sanitizeAnkiHtml in parse.ts accepts a filename→URL map
again (dropped in Phase 8c). Before the cards, import.ts uploads all
referenced media files to MinIO via uploadMedia(), collects the URLs,
and replaces Anki filenames with /api/v1/media/<id> paths in `<img>`
(as Markdown) and `[sound:…]` (as HTML <audio>). Uploads run with
4-way worker concurrency.
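
The replacement step, condensed to its core (same regex shapes as in
parse.ts; unmapped filenames are dropped rather than left as dead
links):

```typescript
// Replace Anki media refs with app URLs: <img src="..."> becomes
// Markdown, [sound:...] becomes an HTML <audio> tag; anything the
// upload step could not map is removed instead of kept as a 404 link.
function replaceAnkiMedia(field: string, urlByFilename: Map<string, string>): string {
  const imgReplaced = field.replace(
    /<img\b[^>]*\bsrc=["']([^"']+)["'][^>]*>/gi,
    (_, src: string) => {
      const url = urlByFilename.get(src);
      return url ? `![${src}](${url})` : '';
    }
  );
  return imgReplaced.replace(/\[sound:([^\]]+)\]/g, (_, name: string) => {
    const url = urlByFilename.get(name);
    return url ? `<audio controls preload="metadata" src="${url}"></audio>` : '';
  });
}
```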

apps/web/src/lib/markdown.ts: DOMPurify now lets <audio>/<video>/
<source> through with their src/controls/preload attributes;
otherwise the audio tags from the Anki import would be stripped.
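
A sketch of the allowlist extension, assuming DOMPurify's standard
ADD_TAGS/ADD_ATTR options; the exact call site in markdown.ts may
differ:

```typescript
// Extend DOMPurify's defaults rather than replacing them: the three
// media tags plus the playback attributes the Anki import emits.
const mediaSanitizeOptions = {
  ADD_TAGS: ['audio', 'video', 'source'],
  ADD_ATTR: ['controls', 'preload'],
};
// Usage (sketch): DOMPurify.sanitize(html, mediaSanitizeOptions)
```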

i18n strings (DE/EN) extended for the media stage: stage_media,
done_media, what_works_media, dropzone_hint, preview_media.
import.what_skipped_media now confirms that media has been imported
since Sprint 9k.

Manual E2E smoke against local MinIO (cards-minio :9100):
- uploaded a 1×1 PNG → 201 with ID + URL
- /api/v1/media/<id> streams 200 image/png, 69 bytes (identification
  via `file` confirmed)
- cross-user → 404, missing X-User-Id → 401, text/plain → 415

53 API tests green (+4 new media auth-gate tests), 7 web tests,
51 domain tests, type-check + svelte-check with 0 errors.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
This commit is contained in:
Till JS 2026-05-08 18:42:56 +02:00
parent e7ae93dcf9
commit c9eb0a6f80
20 changed files with 886 additions and 78 deletions


@ -19,8 +19,9 @@
},
"dependencies": {
"@cards/domain": "workspace:*",
"hono": "^4.6.0",
"drizzle-orm": "0.38",
"hono": "^4.6.0",
"minio": "^8.0.7",
"postgres": "^3.4.0",
"zod": "3",
"zod-to-json-schema": "^3.23.0"


@ -22,8 +22,13 @@ export type {
export { tags } from './tags.ts';
export type { TagRow, TagInsert } from './tags.ts';
export { mediaRefs } from './media.ts';
export type { MediaRefRow, MediaRefInsert } from './media.ts';
export { mediaRefs, mediaFiles } from './media.ts';
export type {
MediaRefRow,
MediaRefInsert,
MediaFileRow,
MediaFileInsert,
} from './media.ts';
export { importJobs } from './imports.ts';
export type { ImportJobRow, ImportJobInsert } from './imports.ts';


@ -4,9 +4,42 @@ import { cardsSchema } from './_schema.ts';
import { cards } from './cards.ts';
/**
* Media references to object IDs in mana-media. The actual files
* (images, audio, video) live in MinIO via mana-media; this table
* holds only the reference plus sort order per card.
* Media files: images, audio, video stored in MinIO under the
* `objectKey` and referenced by cards via cards.media_refs[].
*
* Deliberately no FK to a specific card: one file can be referenced
* by several cards (e.g. one image for front and back). Lifecycle
* cleanup via cron or GDPR delete.
*
* objectKey format: `<userId>/<ulid>.<ext>`. The userId prefix
* simplifies GDPR delete (bucket prefix sweep).
*/
export const mediaFiles = cardsSchema.table(
'media_files',
{
id: text('id').primaryKey(),
userId: text('user_id').notNull(),
objectKey: text('object_key').notNull(),
mimeType: text('mime_type').notNull(),
originalFilename: text('original_filename'),
sizeBytes: integer('size_bytes').notNull(),
kind: text('kind', { enum: ['image', 'audio', 'video', 'other'] }).notNull(),
createdAt: timestamp('created_at', { withTimezone: true, mode: 'date' })
.notNull()
.defaultNow(),
},
(t) => ({
userIdx: index('media_files_user_idx').on(t.userId),
})
);
export type MediaFileRow = typeof mediaFiles.$inferSelect;
export type MediaFileInsert = typeof mediaFiles.$inferInsert;
/**
* Legacy: media_refs from Phase 1, pre-Sprint 15. Deliberately kept as
* a sort-layer slot for later (mana-media convergence). Currently empty.
*/
export const mediaRefs = cardsSchema.table(
'media_refs',


@ -11,6 +11,7 @@ import { toolsRouter } from './routes/tools.ts';
import { searchRouter } from './routes/search.ts';
import { dsgvoRouter } from './routes/dsgvo.ts';
import { meRouter } from './routes/me.ts';
import { mediaRouter } from './routes/media.ts';
const app = new Hono();
@ -41,6 +42,7 @@ app.route('/api/v1/tools', toolsRouter());
app.route('/api/v1/search', searchRouter());
app.route('/api/v1/dsgvo', dsgvoRouter());
app.route('/api/v1/me', meRouter());
app.route('/api/v1/media', mediaRouter());
app.get('/', (c) =>
c.json({


@ -6,12 +6,14 @@ import {
cards,
decks,
importJobs,
mediaFiles,
mediaRefs,
reviews,
studySessions,
tags,
} from '../db/schema/index.ts';
import { serviceKeyAuth } from '../middleware/service-key.ts';
import { getStorage } from '../services/storage.ts';
export type DsgvoDeps = { db?: CardsDb };
@ -21,16 +23,25 @@ export type DsgvoDeps = { db?: CardsDb };
* Export from /api/v1/me.
*/
export async function buildUserExport(db: CardsDb, userId: string) {
const [decksRows, cardsRows, reviewsRows, sessionsRows, tagsRows, mediaRows, importsRows] =
await Promise.all([
db.select().from(decks).where(eq(decks.userId, userId)),
db.select().from(cards).where(eq(cards.userId, userId)),
db.select().from(reviews).where(eq(reviews.userId, userId)),
db.select().from(studySessions).where(eq(studySessions.userId, userId)),
db.select().from(tags).where(eq(tags.userId, userId)),
db.select().from(mediaRefs).where(eq(mediaRefs.userId, userId)),
db.select().from(importJobs).where(eq(importJobs.userId, userId)),
]);
const [
decksRows,
cardsRows,
reviewsRows,
sessionsRows,
tagsRows,
mediaRefRows,
mediaFileRows,
importsRows,
] = await Promise.all([
db.select().from(decks).where(eq(decks.userId, userId)),
db.select().from(cards).where(eq(cards.userId, userId)),
db.select().from(reviews).where(eq(reviews.userId, userId)),
db.select().from(studySessions).where(eq(studySessions.userId, userId)),
db.select().from(tags).where(eq(tags.userId, userId)),
db.select().from(mediaRefs).where(eq(mediaRefs.userId, userId)),
db.select().from(mediaFiles).where(eq(mediaFiles.userId, userId)),
db.select().from(importJobs).where(eq(importJobs.userId, userId)),
]);
return {
user_id: userId,
@ -59,10 +70,14 @@ export async function buildUserExport(db: CardsDb, userId: string) {
finishedAt: s.finishedAt ? s.finishedAt.toISOString() : null,
})),
tags: tagsRows.map((t) => ({ ...t, createdAt: t.createdAt.toISOString() })),
media_refs: mediaRows.map((m) => ({
media_refs: mediaRefRows.map((m) => ({
...m,
createdAt: m.createdAt.toISOString(),
})),
media_files: mediaFileRows.map((f) => ({
...f,
createdAt: f.createdAt.toISOString(),
})),
import_jobs: importsRows.map((j) => ({
...j,
createdAt: j.createdAt.toISOString(),
@ -104,7 +119,9 @@ export function dsgvoRouter(deps: DsgvoDeps = {}): Hono {
* cards → card_tags
* decks → tags
* decks → study_sessions
* Remaining: import_jobs (own table without FK) is deleted separately.
* Remaining: import_jobs + media_files (separate tables without FK)
* are deleted separately. MinIO objects are removed via a bucket
* prefix sweep (objectKey format `<userId>/<ulid>.<ext>`).
*/
r.post('/delete', async (c) => {
const body = await c.req.json().catch(() => null);
@ -112,14 +129,33 @@ export function dsgvoRouter(deps: DsgvoDeps = {}): Hono {
if (!userId) return c.json({ error: 'missing_user_id' }, 400);
const db = dbOf();
const [deletedDecks, deletedImports] = await db.transaction(async (tx) => {
const dd = await tx.delete(decks).where(eq(decks.userId, userId)).returning({ id: decks.id });
const di = await tx
.delete(importJobs)
.where(eq(importJobs.userId, userId))
.returning({ id: importJobs.id });
return [dd, di];
});
const [deletedDecks, deletedImports, deletedMediaFiles] = await db.transaction(
async (tx) => {
const dd = await tx
.delete(decks)
.where(eq(decks.userId, userId))
.returning({ id: decks.id });
const di = await tx
.delete(importJobs)
.where(eq(importJobs.userId, userId))
.returning({ id: importJobs.id });
const dm = await tx
.delete(mediaFiles)
.where(eq(mediaFiles.userId, userId))
.returning({ id: mediaFiles.id });
return [dd, di, dm];
}
);
// MinIO bucket sweep after the DB cleanup. If the storage sweep
// fails, that is non-fatal: the DB is already deleted consistently,
// and storage files without a DB row are dead bytes.
let storageObjectsDeleted = 0;
try {
storageObjectsDeleted = await getStorage().removeObjectsByPrefix(`${userId}/`);
} catch (err) {
console.warn('[dsgvo/delete] storage sweep failed:', err);
}
return c.json({
deleted: true,
@ -127,6 +163,8 @@ export function dsgvoRouter(deps: DsgvoDeps = {}): Hono {
counts: {
decks: deletedDecks.length,
import_jobs: deletedImports.length,
media_files: deletedMediaFiles.length,
storage_objects: storageObjectsDeleted,
},
});
});


@ -2,8 +2,9 @@ import { and, eq, gte, isNotNull, lte, sql } from 'drizzle-orm';
import { Hono } from 'hono';
import { getDb, type CardsDb } from '../db/connection.ts';
import { cards, decks, importJobs, reviews } from '../db/schema/index.ts';
import { cards, decks, importJobs, mediaFiles, reviews } from '../db/schema/index.ts';
import { authMiddleware, type AuthVars } from '../middleware/auth.ts';
import { getStorage } from '../services/storage.ts';
import { buildUserExport } from './dsgvo.ts';
export type MeDeps = { db?: CardsDb };
@ -125,14 +126,30 @@ export function meRouter(deps: MeDeps = {}): Hono<{ Variables: AuthVars }> {
r.post('/delete', async (c) => {
const userId = c.get('userId');
const db = dbOf();
const [deletedDecks, deletedImports] = await db.transaction(async (tx) => {
const dd = await tx.delete(decks).where(eq(decks.userId, userId)).returning({ id: decks.id });
const di = await tx
.delete(importJobs)
.where(eq(importJobs.userId, userId))
.returning({ id: importJobs.id });
return [dd, di];
});
const [deletedDecks, deletedImports, deletedMediaFiles] = await db.transaction(
async (tx) => {
const dd = await tx
.delete(decks)
.where(eq(decks.userId, userId))
.returning({ id: decks.id });
const di = await tx
.delete(importJobs)
.where(eq(importJobs.userId, userId))
.returning({ id: importJobs.id });
const dm = await tx
.delete(mediaFiles)
.where(eq(mediaFiles.userId, userId))
.returning({ id: mediaFiles.id });
return [dd, di, dm];
}
);
let storageObjectsDeleted = 0;
try {
storageObjectsDeleted = await getStorage().removeObjectsByPrefix(`${userId}/`);
} catch (err) {
console.warn('[me/delete] storage sweep failed:', err);
}
return c.json({
deleted: true,
@ -140,6 +157,8 @@ export function meRouter(deps: MeDeps = {}): Hono<{ Variables: AuthVars }> {
counts: {
decks: deletedDecks.length,
import_jobs: deletedImports.length,
media_files: deletedMediaFiles.length,
storage_objects: storageObjectsDeleted,
},
});
});


@ -0,0 +1,156 @@
import { and, eq } from 'drizzle-orm';
import { Hono } from 'hono';
import { getDb, type CardsDb } from '../db/connection.ts';
import { mediaFiles, type MediaFileRow } from '../db/schema/index.ts';
import { authMiddleware, type AuthVars } from '../middleware/auth.ts';
import { ulid } from '../lib/ulid.ts';
import { getStorage, type StorageService } from '../services/storage.ts';
export type MediaDeps = { db?: CardsDb; storage?: StorageService };
const MAX_BYTES = Number(process.env.CARDS_MEDIA_MAX_BYTES ?? 25 * 1024 * 1024); // 25 MiB
const ALLOWED_PREFIXES = ['image/', 'audio/', 'video/'];
function kindFor(mime: string): MediaFileRow['kind'] {
if (mime.startsWith('image/')) return 'image';
if (mime.startsWith('audio/')) return 'audio';
if (mime.startsWith('video/')) return 'video';
return 'other';
}
function extFor(mime: string, fallback?: string): string {
const map: Record<string, string> = {
'image/jpeg': 'jpg',
'image/png': 'png',
'image/gif': 'gif',
'image/webp': 'webp',
'image/svg+xml': 'svg',
'audio/mpeg': 'mp3',
'audio/ogg': 'ogg',
'audio/wav': 'wav',
'audio/mp4': 'm4a',
'video/mp4': 'mp4',
'video/webm': 'webm',
};
if (map[mime]) return map[mime];
if (fallback) {
const dot = fallback.lastIndexOf('.');
if (dot > 0 && dot < fallback.length - 1) return fallback.slice(dot + 1).toLowerCase();
}
return 'bin';
}
export function mediaRouter(deps: MediaDeps = {}): Hono<{ Variables: AuthVars }> {
const r = new Hono<{ Variables: AuthVars }>();
const dbOf = () => deps.db ?? getDb();
const storageOf = () => deps.storage ?? getStorage();
r.use('*', authMiddleware);
/**
* Multipart upload of a single file. Images/audio/video; everything
* else is rejected with 415. Limit via env (default 25 MiB).
*
* Response:
* { id, url, mime_type, kind, size_bytes, original_filename }
*
* `url` is a relative app path (`/api/v1/media/<id>`) that the
* frontend + Anki importer can put into `<img src=...>`.
* A public absolute URL only arrives with Phase 10 + DNS.
*/
r.post('/upload', async (c) => {
const userId = c.get('userId');
const form = await c.req.formData().catch(() => null);
if (!form) return c.json({ error: 'expected_multipart' }, 400);
const file = form.get('file');
if (!(file instanceof File)) return c.json({ error: 'missing_file_field' }, 400);
const mime = file.type || 'application/octet-stream';
if (!ALLOWED_PREFIXES.some((p) => mime.startsWith(p))) {
return c.json({ error: 'unsupported_media_type', mime_type: mime }, 415);
}
if (file.size > MAX_BYTES) {
return c.json({ error: 'too_large', max_bytes: MAX_BYTES, got: file.size }, 413);
}
const id = ulid();
const ext = extFor(mime, file.name);
const objectKey = `${userId}/${id}.${ext}`;
const buf = new Uint8Array(await file.arrayBuffer());
await storageOf().putObject(objectKey, buf, mime);
const [row] = await dbOf()
.insert(mediaFiles)
.values({
id,
userId,
objectKey,
mimeType: mime,
originalFilename: file.name || null,
sizeBytes: buf.byteLength,
kind: kindFor(mime),
createdAt: new Date(),
})
.returning();
return c.json(
{
id: row.id,
url: `/api/v1/media/${row.id}`,
mime_type: row.mimeType,
kind: row.kind,
size_bytes: row.sizeBytes,
original_filename: row.originalFilename,
},
201
);
});
/**
* Streams a media file via MinIO getObject. User-gated: other users'
* files get 404 (not 403, so IDs are not enumerable).
*/
r.get('/:id', async (c) => {
const userId = c.get('userId');
const id = c.req.param('id');
const [row] = await dbOf()
.select()
.from(mediaFiles)
.where(and(eq(mediaFiles.id, id), eq(mediaFiles.userId, userId)))
.limit(1);
if (!row) return c.json({ error: 'not_found' }, 404);
const stream = await storageOf().getObjectStream(row.objectKey);
c.header('Content-Type', row.mimeType);
c.header('Content-Length', String(row.sizeBytes));
c.header('Cache-Control', 'private, max-age=31536000, immutable');
return new Response(stream as unknown as ReadableStream, {
status: 200,
headers: c.res.headers,
});
});
/** Lists all of the user's media files; useful for the UI later. */
r.get('/', async (c) => {
const userId = c.get('userId');
const rows = await dbOf().select().from(mediaFiles).where(eq(mediaFiles.userId, userId));
return c.json({
files: rows.map((r) => ({
id: r.id,
url: `/api/v1/media/${r.id}`,
mime_type: r.mimeType,
kind: r.kind,
size_bytes: r.sizeBytes,
original_filename: r.originalFilename,
created_at: r.createdAt.toISOString(),
})),
total: rows.length,
});
});
return r;
}


@ -0,0 +1,92 @@
/**
* Object storage via MinIO (S3-API compatible).
*
* Local: container `cards-minio` (see infrastructure/docker-compose.yml)
* on 9100/9101; the platform MinIO stays undisturbed on 9000/9001.
*
* Production (Phase 10): either a dedicated MinIO on the Mac mini with
* a separate bucket, or the platform MinIO with its own bucket
* `cards-media`. Configured via env; no code path has to be rewired.
*/
import * as Minio from 'minio';
let cached: StorageService | null = null;
export class StorageService {
readonly client: Minio.Client;
readonly bucket: string;
private bucketReady = false;
constructor() {
this.client = new Minio.Client({
endPoint: process.env.CARDS_S3_ENDPOINT ?? 'localhost',
port: Number(process.env.CARDS_S3_PORT ?? 9100),
useSSL: process.env.CARDS_S3_USE_SSL === 'true',
accessKey: process.env.CARDS_S3_ACCESS_KEY ?? 'cardsadmin',
secretKey: process.env.CARDS_S3_SECRET_KEY ?? 'cardsadmin',
});
this.bucket = process.env.CARDS_S3_BUCKET ?? 'cards-media';
}
/** Idempotent bucket init. Called once per process lifetime. */
async ensureBucket(): Promise<void> {
if (this.bucketReady) return;
const exists = await this.client.bucketExists(this.bucket).catch(() => false);
if (!exists) {
await this.client.makeBucket(this.bucket);
}
this.bucketReady = true;
}
async putObject(
key: string,
body: Buffer | Uint8Array,
contentType: string
): Promise<void> {
await this.ensureBucket();
await this.client.putObject(this.bucket, key, Buffer.from(body), body.byteLength, {
'Content-Type': contentType,
});
}
async getObjectStream(key: string): Promise<NodeJS.ReadableStream> {
await this.ensureBucket();
return this.client.getObject(this.bucket, key);
}
async statObject(key: string): Promise<{ size: number; contentType: string }> {
await this.ensureBucket();
const stat = await this.client.statObject(this.bucket, key);
return {
size: stat.size,
contentType: stat.metaData?.['content-type'] ?? 'application/octet-stream',
};
}
async removeObject(key: string): Promise<void> {
await this.ensureBucket();
await this.client.removeObject(this.bucket, key);
}
async removeObjectsByPrefix(prefix: string): Promise<number> {
await this.ensureBucket();
const objectsStream = this.client.listObjectsV2(this.bucket, prefix, true);
const keys: string[] = [];
for await (const obj of objectsStream) {
if (obj.name) keys.push(obj.name);
}
if (keys.length > 0) await this.client.removeObjects(this.bucket, keys);
return keys.length;
}
}
export function getStorage(): StorageService {
if (!cached) cached = new StorageService();
return cached;
}
export function resetStorageForTests(): void {
cached = null;
}


@ -0,0 +1,53 @@
import { describe, it, expect } from 'vitest';
import { Hono } from 'hono';
import { mediaRouter } from '../src/routes/media.ts';
import type { CardsDb } from '../src/db/connection.ts';
/**
* Auth-gate tests for the media routes without a real DB. We check that
* the routes honor the auth middleware and that validation errors are
* consistent. A real MinIO round trip stays in the manual E2E smoke,
* because sql.js + JSZip + the MinIO SDK would be too much mock
* overhead in Vitest.
*/
function buildApp() {
const app = new Hono();
const stub = {} as CardsDb;
app.route('/api/v1/media', mediaRouter({ db: stub }));
return { app };
}
describe('mediaRouter — auth-gate', () => {
it('GET without X-User-Id is 401', async () => {
const { app } = buildApp();
const res = await app.request('/api/v1/media');
expect(res.status).toBe(401);
});
it('GET /:id without X-User-Id is 401', async () => {
const { app } = buildApp();
const res = await app.request('/api/v1/media/abc');
expect(res.status).toBe(401);
});
it('POST /upload without X-User-Id is 401', async () => {
const { app } = buildApp();
const res = await app.request('/api/v1/media/upload', {
method: 'POST',
});
expect(res.status).toBe(401);
});
});
describe('mediaRouter — Input-Validation', () => {
it('POST /upload without a multipart body is 400', async () => {
const { app } = buildApp();
const res = await app.request('/api/v1/media/upload', {
method: 'POST',
headers: { 'X-User-Id': 'u-1' },
});
expect(res.status).toBe(400);
const body = (await res.json()) as { error: string };
expect(body.error).toBe('expected_multipart');
});
});


@ -6,37 +6,123 @@
* (Anki `::` to ` / ` flattens the hierarchy, as in the original).
* Cards are created with sanitized Markdown.
*
* Phase 8 MVP: images + audio are dropped (see parse.ts
* `sanitizeAnkiHtml`). A later media path is additive.
* Phase 9k: media upload via MinIO. Images + audio are uploaded into
* the cards bucket before the cards; the sanitize path replaces
* Anki filenames with real media URLs (`/api/v1/media/<id>`).
*
* Phase 9j re-import dedupe: before insert, the card's content_hash
* is computed (same function as the server) and checked against the
* user's existing hash list. Duplicates are counted and skipped, so
* re-imports no longer bring duplicate cards into the deck. Decks are
* not deduplicated (intentional: two .apkg files with identical deck
* names must not be merged accidentally).
* Phase 9j re-import dedupe: the content_hash set is loaded before
* the loop; duplicates are counted and skipped.
*/
import JSZip from 'jszip';
import { cardContentHash } from '@cards/domain';
import { createDeck } from '$lib/api/decks.ts';
import { createCard, listCardHashes } from '$lib/api/cards.ts';
import { uploadMedia } from '$lib/api/media.ts';
import { sanitizeAnkiHtml, type ParsedAnki } from './parse.ts';
export interface ImportResult {
decksCreated: number;
cardsCreated: number;
cardsSkippedDuplicate: number;
mediaUploaded: number;
mediaFailed: number;
failed: number;
failures: string[];
}
export interface ImportProgress {
stage: 'decks' | 'cards' | 'done';
stage: 'media' | 'decks' | 'cards' | 'done';
current: number;
total: number;
}
const MEDIA_CONCURRENCY = 4;
const IMG_RE = /<img\b[^>]*\bsrc=["']([^"']+)["']/gi;
const SOUND_RE = /\[sound:([^\]]+)\]/g;
function collectMediaRefs(parsed: ParsedAnki): Set<string> {
const refs = new Set<string>();
for (const card of parsed.cards) {
for (const value of Object.values(card.fields)) {
let m: RegExpExecArray | null;
IMG_RE.lastIndex = 0;
while ((m = IMG_RE.exec(value))) refs.add(m[1]);
SOUND_RE.lastIndex = 0;
while ((m = SOUND_RE.exec(value))) refs.add(m[1]);
}
}
return refs;
}
function guessMime(filename: string): string {
const ext = filename.split('.').pop()?.toLowerCase() ?? '';
const map: Record<string, string> = {
jpg: 'image/jpeg',
jpeg: 'image/jpeg',
png: 'image/png',
gif: 'image/gif',
webp: 'image/webp',
svg: 'image/svg+xml',
mp3: 'audio/mpeg',
ogg: 'audio/ogg',
oga: 'audio/ogg',
wav: 'audio/wav',
m4a: 'audio/mp4',
mp4: 'video/mp4',
webm: 'video/webm',
};
return map[ext] ?? 'application/octet-stream';
}
async function uploadAllMedia(
parsed: ParsedAnki,
onProgress?: (current: number, total: number) => void
): Promise<{ urlByFilename: Map<string, string>; uploaded: number; failed: number }> {
const referenced = [...collectMediaRefs(parsed)].filter((f) => parsed.mediaByFilename.has(f));
const urlByFilename = new Map<string, string>();
let uploaded = 0;
let failed = 0;
let done = 0;
if (referenced.length === 0) {
onProgress?.(0, 0);
return { urlByFilename, uploaded, failed };
}
let nextIdx = 0;
async function worker() {
while (true) {
const idx = nextIdx++;
if (idx >= referenced.length) return;
const filename = referenced[idx];
const entry = parsed.mediaByFilename.get(filename);
if (!entry) {
failed++;
done++;
onProgress?.(done, referenced.length);
continue;
}
try {
const blob = await (entry as JSZip.JSZipObject).async('blob');
const file = new File([blob], filename, { type: guessMime(filename) });
const result = await uploadMedia(file);
urlByFilename.set(filename, result.url);
uploaded++;
} catch (e) {
console.warn(`[anki-import] media upload failed for ${filename}:`, e);
failed++;
}
done++;
onProgress?.(done, referenced.length);
}
}
await Promise.all(Array.from({ length: MEDIA_CONCURRENCY }, () => worker()));
return { urlByFilename, uploaded, failed };
}
export async function importParsedAnki(
parsed: ParsedAnki,
opts: { onProgress?: (p: ImportProgress) => void } = {}
@ -45,22 +131,32 @@ export async function importParsedAnki(
decksCreated: 0,
cardsCreated: 0,
cardsSkippedDuplicate: 0,
mediaUploaded: 0,
mediaFailed: 0,
failed: 0,
failures: [],
};
// Load the user's hash list before inserting. If the endpoint
// fails (e.g. an older server, pre-Phase 9j), we silently fall
// back to "no dedupe".
// Load the hash set before the loop (Phase 9j dedupe).
const existingHashes = new Set<string>();
try {
const r = await listCardHashes();
for (const h of r.hashes) existingHashes.add(h);
} catch {
// Dedupe stays off; cards are inserted as before.
// Dedupe stays off (older server or similar).
}
// 1) Decks: flatten the Anki "::" hierarchy into " / " strings.
// 1) Media: upload before the cards so the sanitize path can insert
//    real URLs. Files that are not in the Anki manifest are dropped;
//    upload failures are counted and dropped from the card field
//    (instead of a 404 URL).
const { urlByFilename, uploaded, failed } = await uploadAllMedia(parsed, (current, total) => {
opts.onProgress?.({ stage: 'media', current, total });
});
result.mediaUploaded = uploaded;
result.mediaFailed = failed;
// 2) Decks: flatten the Anki "::" hierarchy into " / " strings.
const ankiIdToDeckId = new Map<string, string>();
let deckIdx = 0;
for (const ankiDeck of parsed.decks) {
@ -76,7 +172,6 @@ export async function importParsedAnki(
}
}
// Fallback deck for cards without an explicitly referenced Anki deck.
let fallbackDeckId: string | null = null;
const ensureFallbackDeck = async (): Promise<string | null> => {
if (fallbackDeckId) return fallbackDeckId;
@ -91,14 +186,14 @@ export async function importParsedAnki(
}
};
// 2) Cards: sanitize fields, check content_hash, insert.
// 3) Cards: sanitize with the URL map, content_hash dedupe, insert.
for (let i = 0; i < parsed.cards.length; i++) {
opts.onProgress?.({ stage: 'cards', current: i, total: parsed.cards.length });
const card = parsed.cards[i];
const cleanFields: Record<string, string> = {};
for (const [key, value] of Object.entries(card.fields)) {
cleanFields[key] = sanitizeAnkiHtml(value);
cleanFields[key] = sanitizeAnkiHtml(value, urlByFilename);
}
const hash = await cardContentHash({ type: card.type, fields: cleanFields });
@ -124,8 +219,6 @@ export async function importParsedAnki(
fields: cleanFields,
});
result.cardsCreated++;
// Remember the hash immediately: the same import could contain two
// identical cards (Anki drift); the second would otherwise get in too.
existingHashes.add(hash);
} catch (e) {
result.failed++;


@ -213,23 +213,41 @@ function mapNoteToCard(
/**
* Convert Anki's HTML / image / sound markup to plain text + Markdown.
*
* Phase 8 MVP: images + audio are dropped without replacement (option A).
* A later media path (a local cards upload endpoint or mana-media via
* the Phase 2 auth federation) can plug in a filename→URL map here,
* which then expands into `<img>` / `<audio>` tags.
* `mediaUrlByFilename` maps the Anki filename (as referenced in the
* card field) to a real app URL. Anything not in the map is dropped;
* that happens e.g. when the media upload for that file fails.
*
* `<img>` → Markdown `![alt](url)`. `[sound:foo.mp3]` → HTML
* `<audio src="url" controls>` (Markdown has no native audio syntax,
* but our renderer sanitizes HTML with DOMPurify and lets `<audio>`
* through).
*/
export function sanitizeAnkiHtml(html: string): string {
// Remove image + audio refs entirely.
const imgStripped = html.replace(/<img\b[^>]*>/gi, '');
const soundStripped = imgStripped.replace(/\[sound:[^\]]+\]/g, '');
export function sanitizeAnkiHtml(
html: string,
mediaUrlByFilename: Map<string, string> = new Map()
): string {
const imgReplaced = html.replace(
/<img\b[^>]*\bsrc=["']([^"']+)["'][^>]*>/gi,
(_, src: string) => {
const url = mediaUrlByFilename.get(src);
return url ? `![${src}](${url})` : '';
}
);
const soundReplaced = imgReplaced.replace(/\[sound:([^\]]+)\]/g, (_, name: string) => {
const url = mediaUrlByFilename.get(name);
return url ? `<audio controls preload="metadata" src="${url}"></audio>` : '';
});
return soundStripped
return soundReplaced
.replace(/<br\s*\/?>/gi, '\n')
.replace(/<\/?(?:b|strong)>/gi, '**')
.replace(/<\/?(?:i|em)>/gi, '*')
.replace(/<\/?p>/gi, '\n')
.replace(/<\/?div>/gi, '\n')
.replace(/<[^>]+>/gi, '')
// Drop remaining HTML tags except the ones we just emitted
// (audio/video/source); those must survive into the renderer.
.replace(/<(?!\/?(?:audio|video|source)\b)[^>]+>/gi, '')
.replace(/&nbsp;/g, ' ')
.replace(/&amp;/g, '&')
.replace(/&lt;/g, '<')


@ -0,0 +1,42 @@
import { API_BASE, ApiError } from './client.ts';
import { devUser } from '$lib/auth/dev-stub.svelte.ts';
export interface MediaUploadResult {
id: string;
url: string;
mime_type: string;
kind: 'image' | 'audio' | 'video' | 'other';
size_bytes: number;
original_filename: string | null;
}
/**
* Uploads a single file via multipart/form-data. Unlike the standard
* API helper this cannot go through `Content-Type: application/json`,
* hence a dedicated path with FormData + a manual fetch.
*/
export async function uploadMedia(file: File | Blob, filename?: string): Promise<MediaUploadResult> {
const form = new FormData();
const wrapped = file instanceof File ? file : new File([file], filename ?? 'upload.bin');
form.append('file', wrapped);
const headers: Record<string, string> = {};
if (devUser.id) headers['X-User-Id'] = devUser.id;
const res = await fetch(`${API_BASE}/api/v1/media/upload`, {
method: 'POST',
body: form,
headers,
});
if (!res.ok) {
let body: unknown = null;
try {
body = await res.json();
} catch {
body = await res.text().catch(() => null);
}
throw new ApiError(res.status, body, `media upload failed: ${res.status}`);
}
return res.json();
}

View file

@ -172,7 +172,9 @@
</div>
{:else if stage === 'importing'}
<div class="py-6 text-center text-sm text-[var(--color-muted)]" aria-live="polite">
{#if progress.stage === 'decks'}
{#if progress.stage === 'media'}
{t('import.stage_media', { current: progress.current, total: progress.total })}
{:else if progress.stage === 'decks'}
{t('import.stage_decks', { current: progress.current, total: progress.total })}
{:else if progress.stage === 'cards'}
{t('import.stage_cards', { current: progress.current, total: progress.total })}
@ -204,6 +206,14 @@
{t('import.done_dupes', { n: result.cardsSkippedDuplicate })}
</div>
{/if}
{#if result.mediaUploaded > 0 || result.mediaFailed > 0}
<div class="text-[var(--color-muted)]">
{t('import.done_media', {
uploaded: result.mediaUploaded,
failed: result.mediaFailed,
})}
</div>
{/if}
{#if result.failed > 0}
<details class="text-[var(--color-danger)]">
<summary class="cursor-pointer">{t('import.done_failures', { n: result.failed })}</summary>


@ -132,13 +132,14 @@ export const de: TranslationNode = {
what_works_decks: 'Decks (Anki-Hierarchie Foo::Bar wird zu Foo / Bar).',
what_works_basic: 'Basic + Basic-Reverse: Front/Back direkt.',
what_works_cloze: 'Cloze: {{c1::…}} wird mit Sub-Index pro Cluster angelegt.',
what_works_media: 'Bilder + Audio (eingebettet als Markdown bzw. <audio>-Tag).',
what_skipped_title: 'Was nicht übernommen wird',
what_skipped_media: 'Bilder + Audio (kommen mit der Plattform-Anbindung in einer späteren Phase).',
what_skipped_media: '— (Bilder + Audio werden seit Phase 9k mit übernommen, siehe oben)',
what_skipped_history: 'FSRS-Lernverlauf (Anki-Reviews werden bewusst neu aufgesetzt).',
what_skipped_addons: 'Add-on-spezifische Card-Types (image-occlusion etc.).',
anki_label: 'Aus Anki importieren',
dropzone: '📦 .apkg-Datei hier ablegen oder klicken',
dropzone_hint: 'Basic, Basic + Reverse, Cloze · Bilder + Audio werden in dieser Phase nicht übernommen.',
dropzone_hint: 'Basic, Basic + Reverse, Cloze · Bilder + Audio werden mit übernommen (Limit 25 MB pro Datei).',
parsing: 'Lese {file}…',
preview_found: 'Gefunden in',
preview_decks_one: '1 Deck',
@ -146,17 +147,19 @@ export const de: TranslationNode = {
preview_cards_one: '1 Karte',
preview_cards: '{n} Karten',
preview_breakdown: '({basic} basic, {basic_reverse} basic-reverse, {cloze} cloze)',
preview_media: '{n} Medien (werden in dieser Phase NICHT übernommen)',
preview_media: '{n} Medien werden mitgeladen',
preview_skipped: '{n} übersprungen (unbekannter Typ)',
preview_warnings: 'Hinweise ({n})',
cancel: 'Abbrechen',
import_now: 'Importieren',
stage_media: 'Lade Medien hoch · {current} / {total}',
stage_decks: 'Lege Decks an · {current} / {total}',
stage_cards: 'Importiere Karten · {current} / {total}',
stage_done: 'Fertig.',
done_summary_one: '✓ {cards} Karten in 1 Deck angelegt.',
done_summary: '✓ {cards} Karten in {decks} Decks angelegt.',
done_dupes: '{n} Duplikate übersprungen (gleicher Inhalt schon vorhanden).',
done_media: '{uploaded} Medien geladen, {failed} fehlgeschlagen.',
done_failures: '{n} Fehler',
done_more: 'Weitere Datei',
error_label: 'Fehler: {msg}',


@ -129,13 +129,14 @@ export const en: TranslationNode = {
what_works_decks: 'Decks (Anki hierarchy Foo::Bar becomes Foo / Bar).',
what_works_basic: 'Basic + Basic-Reverse: front/back directly.',
what_works_cloze: 'Cloze: {{c1::…}} is created with sub-index per cluster.',
what_works_media: 'Images + audio (embedded as Markdown / <audio> tag).',
what_skipped_title: 'What is not imported',
what_skipped_media: 'Images + audio (will arrive with the platform integration in a later phase).',
what_skipped_media: '— (Images + audio are imported since Sprint 9k, see above)',
what_skipped_history: 'FSRS learning history (Anki reviews are deliberately reset).',
what_skipped_addons: 'Add-on specific card types (image-occlusion etc.).',
anki_label: 'Import from Anki',
dropzone: '📦 Drop .apkg file here or click',
dropzone_hint: 'Basic, Basic + Reverse, Cloze · Images + audio are not imported in this phase.',
dropzone_hint: 'Basic, Basic + Reverse, Cloze · Images + audio are imported too (limit 25 MB per file).',
parsing: 'Reading {file}…',
preview_found: 'Found in',
preview_decks_one: '1 deck',
@ -143,17 +144,19 @@ export const en: TranslationNode = {
preview_cards_one: '1 card',
preview_cards: '{n} cards',
preview_breakdown: '({basic} basic, {basic_reverse} basic-reverse, {cloze} cloze)',
preview_media: '{n} media files (will NOT be imported in this phase)',
preview_media: '{n} media files will be uploaded',
preview_skipped: '{n} skipped (unknown type)',
preview_warnings: 'Notes ({n})',
cancel: 'Cancel',
import_now: 'Import',
stage_media: 'Uploading media · {current} / {total}',
stage_decks: 'Creating decks · {current} / {total}',
stage_cards: 'Importing cards · {current} / {total}',
stage_done: 'Done.',
done_summary_one: '✓ {cards} cards in 1 deck.',
done_summary: '✓ {cards} cards in {decks} decks.',
done_dupes: '{n} duplicates skipped (same content already exists).',
done_media: '{uploaded} media uploaded, {failed} failed.',
done_failures: '{n} errors',
done_more: 'Another file',
error_label: 'Error: {msg}',


@ -6,6 +6,13 @@ marked.setOptions({
breaks: true,
});
// DOMPurify's defaults do NOT allow <audio>/<video>/<source>. We allow
// them explicitly because the Anki importer embeds audio tags. Adding
// `src` via ADD_ATTR keeps DOMPurify's built-in URI-scheme check
// active, so a `javascript:` URI still cannot slip through.
const ADD_TAGS = ['audio', 'video', 'source'];
const ADD_ATTR = ['controls', 'preload', 'src', 'type', 'autoplay', 'loop'];
/**
* Renders Markdown to HTML, sanitized via DOMPurify.
* Safe against stored XSS from user card content.
@ -18,5 +25,5 @@ export function renderMarkdown(source: string): string {
if (!source) return '';
const html = marked.parse(source, { async: false }) as string;
if (typeof window === 'undefined') return html; // SSR fallback (rare path)
return DOMPurify.sanitize(html);
return DOMPurify.sanitize(html, { ADD_TAGS, ADD_ATTR });
}
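The point of routing the new tags through ADD_TAGS/ADD_ATTR instead of skipping sanitization is that the `src` values still have to pass a scheme check. A simplified stand-in for that invariant (this regexp is illustrative only, not DOMPurify's internal check):

```typescript
// Illustration of the invariant the sanitizer must keep: allowing the
// `src` attribute must not allow dangerous URI schemes. Accept http(s)
// and same-origin paths like /api/v1/media/<id>; reject javascript:
// and protocol-relative URLs. Simplified stand-in, not DOMPurify code.
const SAFE_SRC = /^(?:https?:|\/(?!\/))/i;

function isAllowedMediaSrc(src: string): boolean {
  return SAFE_SRC.test(src.trim());
}
```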


@ -30,6 +30,7 @@
<li>{t('import.what_works_decks')}</li>
<li>{t('import.what_works_basic')}</li>
<li>{t('import.what_works_cloze')}</li>
<li>{t('import.what_works_media')}</li>
</ul>
<div class="mt-2 mb-1 font-medium text-[var(--color-fg)]">{t('import.what_skipped_title')}</div>
<ul class="list-disc pl-4">


@ -149,11 +149,23 @@ describe('parseApkg', () => {
});
describe('sanitizeAnkiHtml', () => {
it('strips image and audio markup', () => {
it('drops images and audio without a URL map (lossy fallback)', () => {
const out = sanitizeAnkiHtml('Vorne <img src="paris.jpg"> Hinten [sound:audio.mp3] fertig.');
expect(out).toBe('Vorne Hinten fertig.');
});
it('replaces images with Markdown when a URL map is set', () => {
const map = new Map([['paris.jpg', '/api/v1/media/abc']]);
const out = sanitizeAnkiHtml('Vorne <img src="paris.jpg"> hinten.', map);
expect(out).toBe('Vorne ![paris.jpg](/api/v1/media/abc) hinten.');
});
it('replaces [sound:…] with <audio> when a URL map is set', () => {
const map = new Map([['x.mp3', '/api/v1/media/xyz']]);
const out = sanitizeAnkiHtml('Vorne [sound:x.mp3] hinten.', map);
expect(out).toContain('<audio controls preload="metadata" src="/api/v1/media/xyz">');
});
it('converts bold/italic to Markdown', () => {
expect(sanitizeAnkiHtml('Das <b>ist</b> <i>wichtig</i>')).toBe('Das **ist** *wichtig*');
});
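The tests pin down two replacement rules. A minimal re-implementation of just those rules (the real `sanitizeAnkiHtml` in parse.ts does more; this sketch only mirrors the behavior the tests above assert):

```typescript
// With a URL map, <img src="f"> becomes a Markdown image and [sound:f]
// becomes an <audio> tag; without a map (or for an unknown filename)
// both are dropped — the lossy fallback.
function replaceMedia(html: string, urls?: Map<string, string>): string {
  return html
    .replace(/<img[^>]*\bsrc="([^"]+)"[^>]*>\s?/g, (_, file: string) => {
      const url = urls?.get(file);
      return url ? `![${file}](${url}) ` : '';
    })
    .replace(/\[sound:([^\]]+)\]\s?/g, (_, file: string) => {
      const url = urls?.get(file);
      return url ? `<audio controls preload="metadata" src="${url}"></audio> ` : '';
    })
    .trim();
}
```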


@ -1,12 +1,11 @@
# Local cards dev setup. Postgres container on 5435 so the
# mana platform stack keeps running undisturbed on 5432.
# Local cards dev setup. Postgres container on 5435 + MinIO on
# 9100/9101 (the platform uses 9000/9001; we stay isolated).
#
# Start: pnpm docker:up (from the repo root)
# Logs:  pnpm docker:logs
# Stop:  pnpm docker:down
#
# Data persists in `infrastructure/.volumes/cards-postgres`,
# git-ignored.
# Data persists in `infrastructure/.volumes/`, git-ignored.
services:
  cards-postgres:
@ -26,3 +25,22 @@ services:
      interval: 5s
      timeout: 3s
      retries: 10

  cards-minio:
    image: minio/minio:latest
    container_name: cards-minio
    restart: unless-stopped
    command: server /data --console-address ':9001'
    environment:
      MINIO_ROOT_USER: cardsadmin
      MINIO_ROOT_PASSWORD: cardsadmin
    ports:
      - '9100:9000' # S3 API
      - '9101:9001' # web console
    volumes:
      - ./.volumes/cards-minio:/data
    healthcheck:
      test: ['CMD', 'mc', 'ready', 'local']
      interval: 5s
      timeout: 3s
      retries: 10
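Uploads land in this container under keys shaped `<userId>/<ulid>.<ext>`, which is what makes the GDPR prefix sweep possible: the S3 API has no cascading delete, so deleting everything under `<userId>/` is the sweep. A sketch of that key layout (the helper names are hypothetical; the real storage.ts generates a ULID for the id part):

```typescript
// Keys are <userId>/<id>.<ext>, so deleting a user's media is a single
// prefix sweep over `${userId}/`. Helper names are illustrative, not
// the actual storage.ts API.
function buildObjectKey(userId: string, id: string, filename: string): string {
  const dot = filename.lastIndexOf('.');
  const ext = dot > 0 ? filename.slice(dot + 1).toLowerCase() : 'bin';
  return `${userId}/${id}.${ext}`;
}

function sweepPrefix(userId: string): string {
  return `${userId}/`; // the prefix handed to the GDPR bucket sweep
}
```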

pnpm-lock.yaml generated

@ -38,6 +38,9 @@ importers:
hono:
specifier: ^4.6.0
version: 4.12.18
minio:
specifier: ^8.0.7
version: 8.0.7
postgres:
specifier: ^3.4.0
version: 3.4.9
@ -567,6 +570,9 @@ packages:
resolution: {integrity: sha512-I1fIDbS3nu++9LUXc08ICrLXE/cdV/n9D0Jm8LOhVH9izUXQSSg2EO4M2+m7K5vc5KdjGBcYrFPhAg48+KE6Kw==}
hasBin: true
'@nodable/entities@2.1.0':
resolution: {integrity: sha512-nyT7T3nbMyBI/lvr6L5TyWbFJAI9FTgVRakNoBqCD+PmID8DzFrrNdLLtHMwMszOtqZa8PAOV24ZqDnQrhQINA==}
'@petamoriken/float16@3.9.3':
resolution: {integrity: sha512-8awtpHXCx/bNpFt4mt2xdkgtgVvKqty8VbjHI/WWWQuEw+KLzFot3f4+LkQY9YmOtq7A5GdOnqoIC8Pdygjk2g==}
@ -975,10 +981,23 @@ packages:
resolution: {integrity: sha512-Izi8RQcffqCeNVgFigKli1ssklIbpHnCYc6AknXGYoB6grJqyeby7jv12JUQgmTAnIDnbck1uxksT4dzN3PWBA==}
engines: {node: '>=12'}
async@3.2.6:
resolution: {integrity: sha512-htCUDlxyyCLMgaM3xXg0C0LW2xqfuQ6p05pCEIsXuyQ+a1koYKTuBMzRNwmybfLgvJDMd0r1LTn4+E0Ti6C2AA==}
axobject-query@4.1.0:
resolution: {integrity: sha512-qIj0G9wZbMGNLjLmg1PT6v2mE9AH2zlnADJD/2tC6E00hgmhUOfEB6greHPAfLRSufHqROIUTkw6E+M3lH0PTQ==}
engines: {node: '>= 0.4'}
block-stream2@2.1.0:
resolution: {integrity: sha512-suhjmLI57Ewpmq00qaygS8UgEq2ly2PCItenIyhMqVjo4t4pGzqMvfgJuX8iWTeSDdfSSqS6j38fL4ToNL7Pfg==}
browser-or-node@2.1.1:
resolution: {integrity: sha512-8CVjaLJGuSKMVTxJ2DpBl5XnlNDiT4cQFeuCJJrvJmts9YrTZDizTX7PjC2s6W4x+MBGZeEY6dGMrF04/6Hgqg==}
buffer-crc32@1.0.0:
resolution: {integrity: sha512-Db1SbgBS/fg/392AblrMJk97KggmvYhr4pB5ZIMTWtaivCPMWLkmb7m21cJvpvgK+J3nsU2CmmixNBZx4vFj/w==}
engines: {node: '>=8.0.0'}
buffer-from@1.1.2:
resolution: {integrity: sha512-E+XQCRwSbaaiChtv6k6Dwgc+bx+Bs6vuKJHHl5kox/BaKbhiXzqQOwK4cO22yElGp2OCmjwVhT3HmxgyPGnJfQ==}
@ -1024,6 +1043,10 @@ packages:
supports-color:
optional: true
decode-uri-component@0.2.2:
resolution: {integrity: sha512-FqUYQ+8o158GyGTrMFJms9qh3CqTKvAqgqsTnkLI8sKu0028orqBhxNMFkFen0zGyg6epACD32pjVk58ngIErQ==}
engines: {node: '>=0.10'}
deep-eql@5.0.2:
resolution: {integrity: sha512-h5k/5U50IJJFpzfL6nO9jaaumfjO/f2NjK/oYB2Djzm4p9L+3T9qWpZqZ2hAbLPuuYq9wrU08WQyBTL5GbPk5Q==}
engines: {node: '>=6'}
@ -1190,10 +1213,20 @@ packages:
estree-walker@3.0.3:
resolution: {integrity: sha512-7RUKfXgSMMkzt6ZuXmqapOurLGPPfgj6l9uRZ7lRGolvk0y2yocc35LdcxKC5PQZdn2DMqioAQ2NoWcrTKmm6g==}
eventemitter3@5.0.4:
resolution: {integrity: sha512-mlsTRyGaPBjPedk6Bvw+aqbsXDtoAyAzm5MO7JgU+yVRyMQ5O8bD4Kcci7BS85f93veegeCPkL8R4GLClnjLFw==}
expect-type@1.3.0:
resolution: {integrity: sha512-knvyeauYhqjOYvQ66MznSMs83wmHrCycNEN6Ao+2AeYEfxUIkuiVxdEa1qlGEPK+We3n0THiDciYSsCcgW/DoA==}
engines: {node: '>=12.0.0'}
fast-xml-builder@1.2.0:
resolution: {integrity: sha512-00aAWieqff+ZJhsXA4g1g7M8k+7AYoMUUHF+/zFb5U6Uv/P0Vl4QZo84/IcufzYalLuEj9928bXN9PbbFzMF0Q==}
fast-xml-parser@5.7.3:
resolution: {integrity: sha512-C0AaNuC+mscy6vrAQKAc/rMq+zAPHodfHGZu4sGVehvAQt/JLG1O5zEcYcXSY5zSqr4YVgxsB+pHXTq0i7eDlg==}
hasBin: true
fdir@6.5.0:
resolution: {integrity: sha512-tIbYtZbucOs0BRGqPJkshJUYdL+SDH7dVM8gjy+ERp3WAUjLEFJE+02kanyHtwjWOnwrKYBiwAmM0p4kLJAnXg==}
engines: {node: '>=12.0.0'}
@ -1203,6 +1236,10 @@ packages:
picomatch:
optional: true
filter-obj@1.1.0:
resolution: {integrity: sha512-8rXg1ZnX7xzy2NGDVkBVaAy+lSlPNwad13BtgSlLuxfIslyt5Vg64U7tFcCt4WS1R0hvtnQybT/IyCkGZ3DpXQ==}
engines: {node: '>=0.10.0'}
fsevents@2.3.3:
resolution: {integrity: sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw==}
engines: {node: ^8.16.0 || ^10.6.0 || >=11.0.0}
@ -1236,6 +1273,10 @@ packages:
inherits@2.0.4:
resolution: {integrity: sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==}
ipaddr.js@2.4.0:
resolution: {integrity: sha512-9VGk3HGanVE6JoZXHiCpnGy5X0jYDnN4EA4lntFPj+1vIWlFhIylq2CrrCOJH9EAhc5CYhq18F2Av2tgoAPsYQ==}
engines: {node: '>= 10'}
is-core-module@2.16.2:
resolution: {integrity: sha512-evOr8xfXKxE6qSR0hSXL2r3sd7ALj8+7jQEUvPYcm5sgZFdJ+AYzT6yNmJenvIYQBgIGwfwz08sL8zoL7yq2BA==}
engines: {node: '>= 0.4'}
@ -1343,6 +1384,9 @@ packages:
locate-character@3.0.0:
resolution: {integrity: sha512-SW13ws7BjaeJ6p7Q6CO2nchbYEc3X3J6WrmTTDto7yMPqVSZTUyY5Tjbid+Ab8gLnATtygYtiDIJGQRRn2ZOiA==}
lodash@4.18.1:
resolution: {integrity: sha512-dMInicTPVE8d1e5otfwmmjlxkZoUpiVLwyeTdUsi/Caj/gfzzblBcCE5sRHV/AsjuCmxWrte2TNGSYuCeCq+0Q==}
loupe@3.2.1:
resolution: {integrity: sha512-CdzqowRJCeLU72bHvWqwRBBlLcMEtIvGrlvef74kMnV2AolS9Y8xUv1I0U/MNAWMhBlKIoyuEgoJ0t/bbwHbLQ==}
@ -1354,6 +1398,18 @@ packages:
engines: {node: '>= 20'}
hasBin: true
mime-db@1.52.0:
resolution: {integrity: sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==}
engines: {node: '>= 0.6'}
mime-types@2.1.35:
resolution: {integrity: sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==}
engines: {node: '>= 0.6'}
minio@8.0.7:
resolution: {integrity: sha512-E737MgufW8CeQAsTAtnEMrxZ9scMSf29kkhZoXzDTKj/Jszzo2SfeZUH9wbDQH2Rsq6TCtl/yQL0+XdVKZansQ==}
engines: {node: ^16 || ^18 || >=20}
mri@1.2.0:
resolution: {integrity: sha512-tzzskb3bG8LvYGFF/mDTpq3jpI6Q9wc3LEmBaghu+DdCssd1FakN7Bc0hVNmEyGq1bq3RgfkCb3cmQLpNPOroA==}
engines: {node: '>=4'}
@ -1373,6 +1429,10 @@ packages:
pako@1.0.11:
resolution: {integrity: sha512-4hLB8Py4zZce5s4yd9XzopqwVv/yGNhV1Bl8NTmCq1763HeK2+EwVTv+leGeL13Dnh2wfbqowVPXCIO0z4taYw==}
path-expression-matcher@1.5.0:
resolution: {integrity: sha512-cbrerZV+6rvdQrrD+iGMcZFEiiSrbv9Tfdkvnusy6y0x0GKBXREFg/Y65GhIfm0tnLntThhzCnfKwp1WRjeCyQ==}
engines: {node: '>=14.0.0'}
path-parse@1.0.7:
resolution: {integrity: sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==}
@ -1412,9 +1472,17 @@ packages:
process-nextick-args@2.0.1:
resolution: {integrity: sha512-3ouUOpQhtgrbOa17J7+uxOTpITYWaGP7/AhoR3+A+/1e9skrzelGi/dXzEYyvbxubEF6Wn2ypscTKiKJFFn1ag==}
query-string@7.1.3:
resolution: {integrity: sha512-hh2WYhq4fi8+b+/2Kg9CEge4fDPvHS534aOOvOZeQ3+Vf2mCFsaFBYj0i+iXcAq6I9Vzp5fjMFBlONvayDC1qg==}
engines: {node: '>=6'}
readable-stream@2.3.8:
resolution: {integrity: sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA==}
readable-stream@3.6.2:
resolution: {integrity: sha512-9u/sniCrY3D5WdsERHzHE4G2YCXqoG5FTHUiCC4SIbr6XcLZBY05ya9EKjYek9O5xOAwjGq+1JdGBAS7Q9ScoA==}
engines: {node: '>= 6'}
readdirp@4.1.2:
resolution: {integrity: sha512-GDhwkLfywWL2s6vEjyhri+eXmfH6j1L7JE27WhqLeYzoh/A3DBaYGEj2H/HFZCn/kMfim73FXxEJTw06WtxQwg==}
engines: {node: '>= 14.18.0'}
@ -1439,6 +1507,10 @@ packages:
safe-buffer@5.1.2:
resolution: {integrity: sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==}
sax@1.6.0:
resolution: {integrity: sha512-6R3J5M4AcbtLUdZmRv2SygeVaM7IhrLXu9BmnOGmmACak8fiUtOsYNWUS4uK7upbmHIBbLBeFeI//477BKLBzA==}
engines: {node: '>=11.0.0'}
semver@7.7.4:
resolution: {integrity: sha512-vFKC2IEtQnVhpT78h1Yp8wzwrf8CM+MzKMHGJZfBtzhZNycRFnXsHk6E5TxIkkMsgNS7mdX3AGB7x2QM2di4lA==}
engines: {node: '>=10'}
@ -1472,6 +1544,10 @@ packages:
resolution: {integrity: sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==}
engines: {node: '>=0.10.0'}
split-on-first@1.1.0:
resolution: {integrity: sha512-43ZssAJaMusuKWL8sKUBQXHWOpq8d6CfN/u1p4gUzfJkM05C8rxTmYrkIPTXapZpORA6LkkzcUulJ8FqA7Uudw==}
engines: {node: '>=6'}
sql.js@1.14.1:
resolution: {integrity: sha512-gcj8zBWU5cFsi9WUP+4bFNXAyF1iRpA3LLyS/DP5xlrNzGmPIizUeBggKa8DbDwdqaKwUcTEnChtd2grWo/x/A==}
@ -1481,9 +1557,22 @@ packages:
std-env@3.10.0:
resolution: {integrity: sha512-5GS12FdOZNliM5mAOxFRg7Ir0pWz8MdpYm6AY6VPkGpbA7ZzmbzNcBJQ0GPvvyWgcY7QAhCgf9Uy89I03faLkg==}
stream-chain@2.2.5:
resolution: {integrity: sha512-1TJmBx6aSWqZ4tx7aTpBDXK0/e2hhcNSTV8+CbFJtDjbb+I1mZ8lHit0Grw9GRT+6JbIrrDd8esncgBi8aBXGA==}
stream-json@1.9.1:
resolution: {integrity: sha512-uWkjJ+2Nt/LO9Z/JyKZbMusL8Dkh97uUBTv3AJQ74y07lVahLY4eEFsPsE97pxYBwr8nnjMAIch5eqI0gPShyw==}
strict-uri-encode@2.0.0:
resolution: {integrity: sha512-QwiXZgpRcKkhTj2Scnn++4PKtWsH0kpzZ62L2R6c/LUVYv7hVnZqcg2+sMuT6R7Jusu1vviK/MFsu6kNJfWlEQ==}
engines: {node: '>=4'}
string_decoder@1.1.1:
resolution: {integrity: sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg==}
strnum@2.3.0:
resolution: {integrity: sha512-ums3KNd42PGyx5xaoVTO1mjU1bH3NpY4vsrVlnv9PNGqQj8wd7rJ6nEypLrJ7z5vxK5RP0yMLo6J/Gsm62DI5Q==}
supports-preserve-symlinks-flag@1.0.0:
resolution: {integrity: sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w==}
engines: {node: '>= 0.4'}
@ -1507,6 +1596,9 @@ packages:
resolution: {integrity: sha512-uxc/zpqFg6x7C8vOE7lh6Lbda8eEL9zmVm/PLeTPBRhh1xCgdWaQ+J1CUieGpIfm2HdtsUpRv+HshiasBMcc6A==}
engines: {node: '>=6'}
through2@4.0.2:
resolution: {integrity: sha512-iOqSav00cVxEEICeD7TjLB1sueEL+81Wpzp2bY17uZjZN0pWZPuo4suZ/61VujxmqSGFfgOcNuTZ85QJwNZQpw==}
tinybench@2.9.0:
resolution: {integrity: sha512-0+DUvqWMValLmha6lr4kD8iAMK1HzV0/aKnCtWb9v9641TnP/MFb7Pc2bxoxQjTXAErryXVgUOfv2YqNllqGeg==}
@ -1627,6 +1719,18 @@ packages:
engines: {node: '>=8'}
hasBin: true
xml-naming@0.1.0:
resolution: {integrity: sha512-k8KO9hrMyNk6tUWqUfkTEZbezRRpONVOzUTnc97VnCvyj6Tf9lyUR9EDAIeiVLv56jsMcoXEwjW8Kv5yPY52lw==}
engines: {node: '>=16.0.0'}
xml2js@0.6.2:
resolution: {integrity: sha512-T4rieHaC1EXcES0Kxxj4JWgaUQHDk+qwHcYOCFHfiwKz7tOVPLq7Hjq9dM1WCMhylqMEfP7hMcOIChvotiZegA==}
engines: {node: '>=4.0.0'}
xmlbuilder@11.0.1:
resolution: {integrity: sha512-fDlsI/kFEx7gLvbecc0/ohLG50fugQp8ryHzMTuW9vSa1GJ0XYWKnhsUx7oie3G98+r56aTQIUB4kht42R3JvA==}
engines: {node: '>=4.0'}
zimmerframe@1.1.4:
resolution: {integrity: sha512-B58NGBEoc8Y9MWWCQGl/gq9xBCe4IiKM0a2x7GZdQKOW5Exr8S1W24J6OgM1njK8xCRGvAJIL/MxXHf6SkmQKQ==}
@ -1879,6 +1983,8 @@ snapshots:
dependencies:
zod: 3.25.76
'@nodable/entities@2.1.0': {}
'@petamoriken/float16@3.9.3': {}
'@polka/url@1.0.0-next.29': {}
@ -2215,8 +2321,18 @@ snapshots:
assertion-error@2.0.1: {}
async@3.2.6: {}
axobject-query@4.1.0: {}
block-stream2@2.1.0:
dependencies:
readable-stream: 3.6.2
browser-or-node@2.1.1: {}
buffer-crc32@1.0.0: {}
buffer-from@1.1.2: {}
bun-types@1.3.13:
@ -2251,6 +2367,8 @@ snapshots:
dependencies:
ms: 2.1.3
decode-uri-component@0.2.2: {}
deep-eql@5.0.2: {}
deepmerge@4.3.1: {}
@ -2387,12 +2505,28 @@ snapshots:
dependencies:
'@types/estree': 1.0.9
eventemitter3@5.0.4: {}
expect-type@1.3.0: {}
fast-xml-builder@1.2.0:
dependencies:
path-expression-matcher: 1.5.0
xml-naming: 0.1.0
fast-xml-parser@5.7.3:
dependencies:
'@nodable/entities': 2.1.0
fast-xml-builder: 1.2.0
path-expression-matcher: 1.5.0
strnum: 2.3.0
fdir@6.5.0(picomatch@4.0.4):
optionalDependencies:
picomatch: 4.0.4
filter-obj@1.1.0: {}
fsevents@2.3.3:
optional: true
@ -2425,6 +2559,8 @@ snapshots:
inherits@2.0.4: {}
ipaddr.js@2.4.0: {}
is-core-module@2.16.2:
dependencies:
hasown: 2.0.3
@ -2509,6 +2645,8 @@ snapshots:
locate-character@3.0.0: {}
lodash@4.18.1: {}
loupe@3.2.1: {}
magic-string@0.30.21:
@ -2517,6 +2655,28 @@ snapshots:
marked@18.0.3: {}
mime-db@1.52.0: {}
mime-types@2.1.35:
dependencies:
mime-db: 1.52.0
minio@8.0.7:
dependencies:
async: 3.2.6
block-stream2: 2.1.0
browser-or-node: 2.1.1
buffer-crc32: 1.0.0
eventemitter3: 5.0.4
fast-xml-parser: 5.7.3
ipaddr.js: 2.4.0
lodash: 4.18.1
mime-types: 2.1.35
query-string: 7.1.3
stream-json: 1.9.1
through2: 4.0.2
xml2js: 0.6.2
mri@1.2.0: {}
mrmime@2.0.1: {}
@ -2527,6 +2687,8 @@ snapshots:
pako@1.0.11: {}
path-expression-matcher@1.5.0: {}
path-parse@1.0.7: {}
pathe@1.1.2: {}
@ -2554,6 +2716,13 @@ snapshots:
process-nextick-args@2.0.1: {}
query-string@7.1.3:
dependencies:
decode-uri-component: 0.2.2
filter-obj: 1.1.0
split-on-first: 1.1.0
strict-uri-encode: 2.0.0
readable-stream@2.3.8:
dependencies:
core-util-is: 1.0.3
@ -2564,6 +2733,12 @@ snapshots:
string_decoder: 1.1.1
util-deprecate: 1.0.2
readable-stream@3.6.2:
dependencies:
inherits: 2.0.4
string_decoder: 1.1.1
util-deprecate: 1.0.2
readdirp@4.1.2: {}
resolve-pkg-maps@1.0.0: {}
@ -2612,6 +2787,8 @@ snapshots:
safe-buffer@5.1.2: {}
sax@1.6.0: {}
semver@7.7.4: {}
set-cookie-parser@3.1.0: {}
@ -2637,16 +2814,28 @@ snapshots:
source-map@0.6.1: {}
split-on-first@1.1.0: {}
sql.js@1.14.1: {}
stackback@0.0.2: {}
std-env@3.10.0: {}
stream-chain@2.2.5: {}
stream-json@1.9.1:
dependencies:
stream-chain: 2.2.5
strict-uri-encode@2.0.0: {}
string_decoder@1.1.1:
dependencies:
safe-buffer: 5.1.2
strnum@2.3.0: {}
supports-preserve-symlinks-flag@1.0.0: {}
svelte-check@4.4.8(picomatch@4.0.4)(svelte@5.55.5)(typescript@5.9.3):
@ -2686,6 +2875,10 @@ snapshots:
tapable@2.3.3: {}
through2@4.0.2:
dependencies:
readable-stream: 3.6.2
tinybench@2.9.0: {}
tinyexec@0.3.2: {}
@ -2791,6 +2984,15 @@ snapshots:
siginfo: 2.0.0
stackback: 0.0.2
xml-naming@0.1.0: {}
xml2js@0.6.2:
dependencies:
sax: 1.6.0
xmlbuilder: 11.0.1
xmlbuilder@11.0.1: {}
zimmerframe@1.1.4: {}
zod-to-json-schema@3.25.2(zod@3.25.76):