fix: 4 boot-time noise + correctness bugs surfaced by post-deploy smoke

All four were pre-existing; the audit smoke-test made them visible. Fixed
together because they share a "boot console-warn cleanup" theme.

1. streaks ensureSeeded race (DexieError2 ×2)
   - Two boot-time liveQuery callers passed the `count > 0` check before
     either had written, then the second's `.add()` hit a ConstraintError.
   - Fix: cache the seed promise per module, run the existence check +
     bulkAdd inside one Dexie RW transaction, and only insert MISSING
     defs (preserves existing currentStreak/longestStreak counts).

2. encryptRecord('agents', …) "wrong table name?" warning
   - The DEV-only check fired whenever a record carried none of the
     registered encrypted fields, regardless of whether anything could
     actually leak. `ensureDefaultAgent` writes a fresh agent row before
     `systemPrompt` / `memory` exist — pure noise.
   - Fix: drop the "no fields at all" branch. Keep the case-mismatch
     branch (the branch that actually catches silent plaintext leaks).

3. Passkey signInWithPasskey "Cannot read properties of undefined
   (reading 'allowCredentials')"
   - Client destructured `{ options, challengeId }` from the server's
     options response, but Better-Auth's `@better-auth/passkey` plugin
     returns the raw PublicKeyCredentialRequestOptionsJSON (no
     envelope) and tracks the challenge in a signed cookie. Both
     `options` and `challengeId` came back undefined; SimpleWebAuthn
     blew up the moment it tried to read the request shape. The verify
     body `{ challengeId, credential }` was likewise wrong — Better-Auth
     wants `{ response }`.
   - Fix: align both register and authenticate flows with Better-Auth's
     native shape on options + verify, and add `credentials: 'include'`
     on every fetch so the challenge cookie actually round-trips.
     Server's verify proxy now reads `parsed?.response?.id` for
     credentialID rate-limiting.

4. /api/v1/me/onboarding/ → 404
   - Hono's nested router (`app.route(prefix, sub)` + inner
     `app.get('/')`) matches the prefix-without-slash form only. The
     onboarding-status store sent the request with a trailing slash, so
     every login produced a 404 + a console warn.
   - Fix: client sends the path without trailing slash; mana-auth picks
     up `hono/trailing-slash` middleware as defense-in-depth so a future
     accidental trailing slash on any /me/* route 301-redirects instead
     of 404-ing.
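The path rule both sides of fix 4 now agree on, as a dependency-free sketch (`trimSlash` is an illustrative name; the real server-side middleware is Hono's `trimTrailingSlash`, which also issues the 301):

```typescript
// Strip exactly one trailing slash, never touching the bare root '/'.
// (Illustrative helper; hono/trailing-slash implements the server side.)
function trimSlash(path: string): string {
  return path.length > 1 && path.endsWith('/') ? path.slice(0, -1) : path;
}

trimSlash('/api/v1/me/onboarding/'); // '/api/v1/me/onboarding'
trimSlash('/');                      // '/' (root is left alone)
```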

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Till JS 2026-04-28 14:56:24 +02:00
parent 44f9155ed3
commit 0c30a16eb5
6 changed files with 106 additions and 55 deletions


@@ -87,16 +87,20 @@ export class VaultLockedError extends Error {
  * Dev-only registry-vs-record shape check.
  *
  * Called from encryptRecord when `import.meta.env.DEV` is truthy (Vite
- * strips the call in production builds). Catches the most common silent
- * failure mode: a registry entry names a field the record doesn't have,
- * because of a case typo. Without this warning, the field stays plaintext
- * forever and no error is ever thrown.
+ * strips the call in production builds). Catches the genuine silent
+ * failure mode: a registry entry names a field the record has only
+ * under a case-mismatched key. Without this warning, the typo'd field
+ * stays plaintext forever and no error is ever thrown.
  *
- * False-positive strategy:
- * - We only warn on close matches (case-insensitive). An optional field
- *   that happens to be omitted from a given write won't light up.
- * - A record that has NONE of the registered fields is also flagged,
- *   which catches wrong-table-name call sites.
+ * What the check explicitly does NOT flag:
+ * - Records that have NONE of the registered fields. Many call sites
+ *   legitimately encrypt records before any optional encrypted field
+ *   has been set (e.g. `ensureDefaultAgent` writes a fresh agent row
+ *   without a `systemPrompt` or `memory`, yet those are filled in
+ *   later when the user customises the agent). Encrypting such a
+ *   record is a no-op anyway, so warning is pure noise.
+ * - Optional fields that just happen to be undefined for this write.
+ *   Same reason: no leak possible without a value to leak.
  *
  * Throttled per (tableName, field) pair so liveQuery loops don't spam.
  */
@@ -111,12 +115,8 @@ function devCheckRegistryShape(
   const lcMap = new Map<string, string>();
   for (const k of recordKeys) lcMap.set(k.toLowerCase(), k);
-  let exactHits = 0;
   for (const field of fields) {
-    if (recordKeySet.has(field)) {
-      exactHits++;
-      continue;
-    }
+    if (recordKeySet.has(field)) continue;
     // Case-insensitive near-miss → almost certainly a typo in the registry.
     const near = lcMap.get(field.toLowerCase());
     if (near && near !== field) {
@@ -131,21 +131,6 @@ function devCheckRegistryShape(
       }
     }
   }
-  // Record has no registered field at all — probably wrong tableName or
-  // a record shape that diverged from the type the registry was written for.
-  if (exactHits === 0 && recordKeys.length > 0) {
-    const key = `${tableName}:no-fields`;
-    if (!_registryWarnings.has(key)) {
-      _registryWarnings.add(key);
-      console.warn(
-        `[mana-crypto] DEV: encryptRecord('${tableName}', ...) called but the record ` +
-          `has none of the registered fields [${fields.join(', ')}]. ` +
-          `Keys on record: [${recordKeys.slice(0, 10).join(', ')}${recordKeys.length > 10 ? ', …' : ''}]. ` +
-          `Wrong table name?`
-      );
-    }
-  }
 }

 /**

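For reference, the surviving near-miss branch reduces to this dependency-free sketch (function name and shapes are illustrative, not the real `mana-crypto` API):

```typescript
// Dependency-free sketch of the kept near-miss check: for each registered
// encrypted field, flag record keys that match only after lowercasing.
// An exact hit is fine; a case-only near-miss is almost certainly a typo
// in the registry. (Illustrative names, not the real module's API.)
function findCaseMismatches(
  registeredFields: string[],
  recordKeys: string[]
): Array<{ field: string; actualKey: string }> {
  const exact = new Set(recordKeys);
  const byLower = new Map(recordKeys.map((k) => [k.toLowerCase(), k] as [string, string]));
  const hits: Array<{ field: string; actualKey: string }> = [];
  for (const field of registeredFields) {
    if (exact.has(field)) continue; // exact match, nothing to flag
    const near = byLower.get(field.toLowerCase());
    if (near && near !== field) hits.push({ field, actualKey: near });
  }
  return hits;
}

// A fresh agent row written before `systemPrompt` exists produces no hits,
// which is exactly why the old "no fields at all" branch was pure noise:
findCaseMismatches(['systemPrompt', 'memory'], ['id', 'createdAt']); // []
```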

@@ -170,22 +170,48 @@ export function stopStreakTracker(): void {
 // ── Seed defaults ───────────────────────────────────

 /**
+ * Cache the in-flight (or completed) seed so concurrent first-callers
+ * await the same operation instead of each starting their own race.
+ * `useStreaks()` is called from multiple components, and the live-query
+ * machinery re-runs `buildAllStreaks` on every `_streakState` change
+ * event. Without this guard, two boot-time callers would each pass
+ * the `count > 0` check before either had written anything, and the
+ * second's `.add()` would throw a ConstraintError.
  */
+let seedPromise: Promise<void> | null = null;
+
 async function ensureSeeded(): Promise<void> {
-  const count = await db.table(TABLE).count();
-  if (count > 0) return;
-  // Seed empty states so useStreaks() returns all definitions. Same
-  // attribution reasoning as markActive — this is a subsystem write.
+  return (seedPromise ??= seedImpl());
+}
+
+async function seedImpl(): Promise<void> {
   // Subsystem write — attribute to the projection actor, not to
   // whoever triggered the upstream read.
   await runAsAsync(PROJECTION_ACTOR, async () => {
-    for (const def of STREAK_DEFS) {
-      await db.table(TABLE).add({
-        id: def.id,
-        label: def.label,
-        moduleId: def.moduleId,
-        currentStreak: 0,
-        longestStreak: 0,
-        lastActiveDate: '',
-      });
-    }
+    // Single Dexie RW transaction: the existence check and the
+    // inserts share the same atomic scope, so even if two browser
+    // tabs / two component mounts hit this in the same microtask,
+    // only one transaction sees an empty table and writes the
+    // defaults. The other sees them and skips.
+    await db.transaction('rw', TABLE, async () => {
+      const existingIds = new Set((await db.table<StreakState>(TABLE).toArray()).map((r) => r.id));
+      const missing = STREAK_DEFS.filter((d) => !existingIds.has(d.id));
+      if (missing.length === 0) return;
+      // bulkAdd is faster than a per-row loop and atomic-on-failure
+      // inside the open transaction. Only the missing definitions
+      // land — existing rows keep their currentStreak / longestStreak.
+      await db.table(TABLE).bulkAdd(
+        missing.map((def) => ({
+          id: def.id,
+          label: def.label,
+          moduleId: def.moduleId,
+          currentStreak: 0,
+          longestStreak: 0,
+          lastActiveDate: '',
+        }))
+      );
+    });
   });
 }

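The promise-cache half of fix 1 can be reduced to a dependency-free sketch; `seedImplSketch` stands in for the real seeding body (which additionally wraps its writes in a Dexie `rw` transaction):

```typescript
// Sketch of the `seedPromise ??= seedImpl()` dedup pattern: however many
// callers race in during boot, the seeding body runs exactly once and
// every caller awaits the same promise. (Illustrative names only.)
let seedPromise: Promise<void> | null = null;
let seedRuns = 0;

async function seedImplSketch(): Promise<void> {
  seedRuns += 1; // stands in for the bulkAdd of missing streak defs
}

async function ensureSeededSketch(): Promise<void> {
  return (seedPromise ??= seedImplSketch());
}

async function demo(): Promise<number> {
  // Two boot-time callers in the same microtask: the old code's race.
  await Promise.all([ensureSeededSketch(), ensureSeededSketch()]);
  return seedRuns;
}
```

`demo()` resolves to 1; the second caller finds the cached promise instead of re-running an existence check that nothing has written to yet.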

@@ -68,7 +68,11 @@ function createOnboardingStatusStore() {
     if (!browser || loading) return;
     loading = true;
     try {
-      const res = await authedFetch('/');
+      // Empty path — `${baseUrl}/api/v1/me/onboarding` without a
+      // trailing slash. Hono's nested router (`app.route(prefix,
+      // sub)` + inner `app.get('/')`) matches the prefix exactly,
+      // not the prefix-with-slash form, so a `/` here would 404.
+      const res = await authedFetch('');
       if (!res.ok) throw new Error(`GET /onboarding → ${res.status}`);
       const data = (await res.json()) as { completedAt: string | null };
       ({ completedAt } = parseStatus(data));


@@ -429,7 +429,16 @@ export function createAuthService(config: AuthServiceConfig): AuthServiceInterfa
     },

     /**
-     * Register a new passkey for the current user
+     * Register a new passkey for the current user.
+     *
+     * The shape of the server's options response and the verify
+     * request body match Better-Auth's `@better-auth/passkey` plugin
+     * exactly: the options endpoint returns the raw
+     * PublicKeyCredentialCreationOptionsJSON (no envelope), and the
+     * verify endpoint accepts `{ response, name? }`. The challenge
+     * is carried in a server-set signed cookie, which is why every
+     * fetch in the flow MUST send `credentials: 'include'` so the
+     * cookie survives the round-trip.
      */
     async registerPasskey(friendlyName?: string): Promise<AuthResult> {
       try {
@@ -440,6 +449,7 @@ export function createAuthService(config: AuthServiceConfig): AuthServiceInterfa
         // Step 1: Get registration options from server
         const optionsRes = await fetch(`${baseUrl}${endpoints.passkeyRegisterOptions}`, {
           method: 'POST',
+          credentials: 'include',
           headers: {
             'Content-Type': 'application/json',
             Authorization: `Bearer ${appToken}`,
@@ -451,19 +461,22 @@ export function createAuthService(config: AuthServiceConfig): AuthServiceInterfa
           return { success: false, error: err.message || 'Failed to get registration options' };
         }

-        const { options, challengeId } = await optionsRes.json();
+        const webauthnOptions = await optionsRes.json();

         // Step 2: Create credential via browser WebAuthn API
-        const credential = await startRegistration({ optionsJSON: options });
+        const credential = await startRegistration({ optionsJSON: webauthnOptions });

-        // Step 3: Send credential to server for verification
+        // Step 3: Send credential to server for verification.
+        // `name` is the Better-Auth parameter name for the
+        // passkey label; `response` is the credential payload.
         const verifyRes = await fetch(`${baseUrl}${endpoints.passkeyRegisterVerify}`, {
           method: 'POST',
+          credentials: 'include',
           headers: {
             'Content-Type': 'application/json',
             Authorization: `Bearer ${appToken}`,
           },
-          body: JSON.stringify({ challengeId, credential, friendlyName }),
+          body: JSON.stringify({ response: credential, name: friendlyName }),
         });

         if (!verifyRes.ok) {
@@ -493,6 +506,13 @@ export function createAuthService(config: AuthServiceConfig): AuthServiceInterfa
      * where the browser surfaces passkeys directly inside the email autofill
      * dropdown instead of opening a modal. The host MUST verify
      * `PublicKeyCredential.isConditionalMediationAvailable()` first.
+     *
+     * Server / client shape matches Better-Auth's `@better-auth/passkey`
+     * plugin exactly: the options endpoint returns the raw
+     * PublicKeyCredentialRequestOptionsJSON (no envelope), and the verify
+     * endpoint accepts `{ response: credential }`. The challenge lives in
+     * a signed cookie set by the server, so every fetch MUST send
+     * `credentials: 'include'` for the cookie to round-trip.
      */
     async signInWithPasskey(options: { conditional?: boolean } = {}): Promise<AuthResult> {
       try {
@@ -502,6 +522,7 @@ export function createAuthService(config: AuthServiceConfig): AuthServiceInterfa
         // Step 1: Get authentication options from server
         const optionsRes = await fetch(`${baseUrl}${endpoints.passkeyAuthOptions}`, {
           method: 'POST',
+          credentials: 'include',
           headers: { 'Content-Type': 'application/json' },
         });
@@ -518,7 +539,7 @@ export function createAuthService(config: AuthServiceConfig): AuthServiceInterfa
           };
         }

-        const { options: webauthnOptions, challengeId } = await optionsRes.json();
+        const webauthnOptions = await optionsRes.json();

         // Step 2: Authenticate via browser WebAuthn API
         const credential = await startAuthentication({
@@ -526,11 +547,14 @@ export function createAuthService(config: AuthServiceConfig): AuthServiceInterfa
           useBrowserAutofill: options.conditional === true,
         });

-        // Step 3: Send credential to server for verification
+        // Step 3: Send credential to server for verification.
+        // Better-Auth expects `{ response: credential }` — the
+        // challenge is read from the signed cookie, not the body.
         const verifyRes = await fetch(`${baseUrl}${endpoints.passkeyAuthVerify}`, {
           method: 'POST',
+          credentials: 'include',
           headers: { 'Content-Type': 'application/json' },
-          body: JSON.stringify({ challengeId, credential }),
+          body: JSON.stringify({ response: credential }),
         });

         if (!verifyRes.ok) {


@@ -7,6 +7,7 @@
 import { Hono } from 'hono';
 import { cors } from 'hono/cors';
+import { trimTrailingSlash } from 'hono/trailing-slash';
 import { loadConfig } from './config';
 import { getDb } from './db/connection';
 import { createBetterAuth } from './auth/better-auth.config';
@@ -78,6 +79,14 @@ const app = new Hono();
 app.onError(errorHandler);
 app.use('*', requestLogger());

+// Defense-in-depth for clients that accidentally request the trailing-slash
+// form of a route (e.g. `/api/v1/me/onboarding/`). Hono's nested-router root
+// matching doesn't include the prefix-with-slash variant, so without this
+// middleware those clients get a 404 even though the same path without the
+// slash would work. Trims the slash and 301-redirects on GET/HEAD, and only
+// when the non-trimmed lookup already produced a 404 — so the legitimate
+// root path `/` is never touched.
+app.use('*', trimTrailingSlash());

 app.use(
   '*',
   cors({


@@ -159,14 +159,17 @@ export function createPasskeyRoutes(
       // Clone the body before the upstream read so we can extract
       // credentialID for rate-limit bookkeeping without double-
-      // consuming the stream. The client sends
-      // `{ challengeId, credential: { id: '<base64url>' } }`.
+      // consuming the stream. The client sends Better-Auth's shape
+      // `{ response: { id: '<base64url>', ... } }` — see
+      // `verifyPasskeyAuthenticationBodySchema` in the upstream
+      // @better-auth/passkey plugin. Falls back to a flat `{ id }`
+      // body for any direct-to-mana-auth caller (legacy harness).
       let credentialId: string | null = null;
       let bodyText: string | null = null;
       try {
         bodyText = await c.req.text();
         const parsed = JSON.parse(bodyText);
-        credentialId = parsed?.credential?.id ?? parsed?.id ?? null;
+        credentialId = parsed?.response?.id ?? parsed?.id ?? null;
       } catch {
         // Body malformed — let the upstream handler return a real
         // validation error. No rate-limit bump because we don't