New backend endpoint that wraps mana-crawler + mana-llm in a single
call so the Kontext "Aus URL" UI can hit one route:
- Starts a crawl job (single page, or a deep crawl of up to 20 pages) via
mana-crawler's /api/v1/crawl, polls its status for up to 90s, then fetches
the paginated results.
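The poll step can be sketched as a small generic helper. The status values, interval, and timeout below are illustrative assumptions, not the crawler's actual contract:

```typescript
// Hypothetical status union; the real crawler may report different states.
type CrawlStatus = "running" | "completed" | "failed";

// Polls checkStatus() until the job leaves "running" or the deadline passes.
// intervalMs/timeoutMs defaults mirror the 90s budget described above.
async function pollUntilDone(
  checkStatus: () => Promise<CrawlStatus>,
  { intervalMs = 2000, timeoutMs = 90_000 } = {},
): Promise<CrawlStatus> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const status = await checkStatus();
    if (status !== "running") return status;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`crawl did not finish within ${timeoutMs}ms`);
}
```

Keeping the poll loop separate from the HTTP calls makes the 90s budget easy to test without a live crawler.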
- When the crawl returns multiple pages, joins them into one markdown
document with an H1 section header per page and pages separated by ---.
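A minimal sketch of the join, assuming the crawler's result pages expose `title` and `markdown` fields (the field names are assumptions):

```typescript
// Assumed shape of one crawled page in the paginated results.
interface CrawledPage {
  title: string;
  markdown: string;
}

// One H1 header per page, sections separated by a `---` thematic break.
function joinPages(pages: CrawledPage[]): string {
  return pages
    .map((p) => `# ${p.title}\n\n${p.markdown.trim()}`)
    .join("\n\n---\n\n");
}
```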
- When summarize=true, routes the collected markdown through
mana-llm's /chat/completions with a system prompt that asks for an
"Überblick / Kernaussagen / Details" H2 structure in the source
language. sanitizeSummary() strips the common local-LLM artefacts
(```markdown fences, "Hier ist …:" preambles, a stray leading H1)
so the output drops cleanly into the Kontext doc. If summarization
fails, the endpoint returns 502 rather than silently falling back
to the raw crawl.
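A sketch of what sanitizeSummary() does; the concrete regexes here are illustrative, not the shipped implementation:

```typescript
// Strips common local-LLM artefacts so the summary starts at its first H2.
function sanitizeSummary(raw: string): string {
  let out = raw.trim();
  // A ```markdown fence wrapping the whole reply.
  out = out.replace(/^```(?:markdown)?\s*\n/, "").replace(/\n```\s*$/, "");
  // A "Hier ist …:" style preamble on the first line.
  out = out.replace(/^Hier ist[^\n]*:\s*\n+/i, "");
  // A stray leading H1 (the requested structure uses H2 headings).
  out = out.replace(/^#\s[^\n]*\n+/, "");
  return out.trim();
}
```

The strips run in order (fence, then preamble, then H1), since each artefact typically wraps the next.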
- Credits are validated and consumed via @mana/shared-hono/credits
(1 credit for crawl-only, 5 for crawl+summary) under the new
AI_CONTEXT_IMPORT_URL action.
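The pricing split can be expressed as a tiny helper; only the action id and the 1/5 costs come from this change, the helper itself is hypothetical:

```typescript
// Action id used when validating/consuming credits for this endpoint.
const AI_CONTEXT_IMPORT_URL = "AI_CONTEXT_IMPORT_URL" as const;

// 1 credit for crawl-only, 5 for crawl+summary.
function creditCost(opts: { summarize: boolean }): number {
  return opts.summarize ? 5 : 1;
}
```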
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>