
Slice I — Unlock rework PROD · GATE 2

Shipped 2026-05-17. PIN unlock → workspace topbar visible dropped from 4403ms to 3470ms (a 933ms / ~21% win, stable across two runs). The route now uses <Suspense> streaming on the OPEN-phase render, so the workspace shell flushes to the browser immediately while the heavy data bundle resolves in a child async server component.

4/4 prod tests · 51/51 regression green — 55/55 total. No regressions across the standard sweep. The latency win was confirmed across two consecutive runs (3465ms, 3470ms). Getting there took three attempts — see "What didn't work" below for the lessons.

Latency measurements (PIN-Enter → workspace topbar visible)

| Approach | Measured | Delta vs baseline |
| --- | --- | --- |
| Pre-Slice-I baseline (sequential page.tsx) | 4403ms | — |
| Slice I attempt 1: action returns bundle | 4977ms | +574ms (regressed) |
| Slice I attempt 2: parallel page.tsx fetches | 4486ms | +83ms (~noise) |
| Slice I attempt 3: Suspense streaming | 3470ms | −933ms (−21%) |

What shipped

Suspense wrapper around the OPEN-phase render in app/(pos-fullscreen)/pos/register/[configId]/page.tsx.

Loader file lib/server/open_phase_loader.ts stays in place — fans out 12 DAOs via Promise.allSettled with per-fetch unwrap() fallback. Used by the SSR path (page.tsx → ProOpenContent).
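The fan-out pattern can be sketched framework-free. This is a minimal illustration, not the real lib/server/open_phase_loader.ts — the DAO names, the bundle shape, and the unwrap() signature here are assumptions, and three stand-in fetchers replace the real 12:

```typescript
// Sketch of the fan-out-with-fallback pattern (illustrative names only).
// unwrap(): per-fetch fallback, so one rejected DAO cannot sink the whole bundle.
function unwrap<T>(settled: PromiseSettledResult<T>, fallback: T): T {
  return settled.status === "fulfilled" ? settled.value : fallback;
}

async function loadOpenPhaseBundle(fetchers: {
  products: () => Promise<string[]>;
  drafts: () => Promise<string[]>;
  movements: () => Promise<number[]>;
}) {
  // Promise.allSettled never rejects — every DAO comes back as a fulfilled or
  // rejected slot, and rejected slots fall back to empty defaults below.
  const [products, drafts, movements] = await Promise.allSettled([
    fetchers.products(),
    fetchers.drafts(),
    fetchers.movements(),
  ]);
  return {
    products: unwrap(products, [] as string[]),
    drafts: unwrap(drafts, [] as string[]),
    movements: unwrap(movements, [] as number[]),
  };
}

// Usage: a DAO that throws (e.g. a pool error) yields its fallback slot
// while the rest of the bundle stays intact:
// const bundle = await loadOpenPhaseBundle({
//   products: async () => ["espresso", "latte"],
//   drafts: async () => { throw new Error("pool exhausted"); },
//   movements: async () => [1200, -300],
// });
// // bundle.drafts === []
```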

Cookie semantics unchanged — unlock action still writes the cashier cookie + audit log; on success the client calls router.refresh() which goes through the new Suspense path. Lock + close-shift round-trip verified green.

What didn't work (logged for future)

Attempt 1 — action returns bundle inline. This put the bundle load on the action's critical path as a serial block and lost Next's RSC streaming, which overlaps server render with client reconciliation. The action took 2519ms on prod (vs 313ms local) because 12 parallel DAOs + the audit-log INSERT hit Supabase's pool_size: 15 limit (EMAXCONNSESSION errors caught by unwrap()). Net: +574ms regression.
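The serial-block cost is easy to reproduce in miniature. A hedged sketch with made-up timings — sleep stands in for the action and bundle work; no real Next.js APIs are involved:

```typescript
// Why attempt 1 regressed: awaiting the bundle inside the unlock action makes the
// user wait for action + bundle back-to-back, while the streamed shape returns
// after the action alone and lets the bundle resolve behind the visible shell.
const sleep = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

// Attempt 1 shape: bundle load on the action's critical path (serial block).
async function serialUnlock(actionMs: number, bundleMs: number): Promise<number> {
  const t0 = Date.now();
  await sleep(actionMs); // auth check + cookie write
  await sleep(bundleMs); // bundle load inside the action
  return Date.now() - t0; // user stares at the lock screen for both
}

// Attempt 3 shape: the action returns once the cookie is set; the bundle
// streams in after the shell is already visible.
async function streamedUnlock(actionMs: number, bundleMs: number): Promise<number> {
  const t0 = Date.now();
  await sleep(actionMs);
  void sleep(bundleMs); // fire-and-forget here, purely to model streaming
  return Date.now() - t0;
}
```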

Attempt 2 — parallelize page.tsx fetches. Same pool-exhaustion problem, just inside the SSR path now. Promise.allSettled waits for ALL fetches, so the slowest one (or a pool-back-off retry) drives wall time. Net: ~0 change (within noise).
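The slowest-fetch effect can be shown in isolation. Sleep durations below are stand-ins for DAO latencies, not measured values:

```typescript
// Promise.allSettled's wall time is the max of its inputs, not the sum — but the
// max is exactly what pool contention inflates: one back-off retry sets the floor
// for the whole bundle, no matter how fast the other DAOs are.
const sleep = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

async function bundleWallTime(daoLatenciesMs: number[]): Promise<number> {
  const t0 = Date.now();
  await Promise.allSettled(daoLatenciesMs.map((ms) => sleep(ms)));
  return Date.now() - t0;
}

// Usage: three fast DAOs plus one retrying straggler —
// const wall = await bundleWallTime([30, 30, 30, 300]);
// // wall ≈ 300ms: parallelism saved nothing past the straggler's latency
```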

Attempt 3 — Suspense streaming (shipped). Stops chasing wall-time reduction and instead renders the workspace shell immediately as the Suspense fallback, so perceived latency drops. The test measurement (PIN-Enter → topbar visible) captures this because the fallback shell renders pos-open-topbar from the start — the cashier's wait for the workspace shell collapsed from the full render cycle to the fallback paint.

Lesson: on prod cf-workers with Supabase pool=15, naive parallelization of N>5 DAOs can be SLOWER than sequential because of pool contention. Suspense streaming wins by changing the user's perception of progress, not the total work done.
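One hypothetical follow-up if wall time ever needs to come down too (not shipped, purely a sketch): cap in-flight DAO calls below the pool size so the fan-out queues in-process instead of contending for connections. pLimit here is a hand-rolled minimal limiter, not a real dependency of the project, and the numbers in the usage note are illustrative:

```typescript
// Minimal concurrency limiter: at most `max` tasks run at once; the rest queue.
function pLimit(max: number) {
  let active = 0;
  const queue: (() => void)[] = [];
  const next = () => {
    // Start one queued task only if there is capacity.
    if (active < max && queue.length > 0) {
      active++;
      queue.shift()!();
    }
  };
  return function run<T>(task: () => Promise<T>): Promise<T> {
    return new Promise<T>((resolve, reject) => {
      const start = () => {
        task().then(resolve, reject).finally(() => {
          active--;
          next();
        });
      };
      if (active < max) {
        active++;
        start();
      } else {
        queue.push(start);
      }
    });
  };
}

// Usage: fan out 12 DAOs but hold concurrency at 6 — comfortably under
// pool_size: 15, leaving headroom for the audit-log INSERT and other requests.
// const limit = pLimit(6);
// const results = await Promise.all(daos.map((dao) => limit(dao)));
```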

UX caveat (acceptable)

For ~1s after PIN unlock, the workspace shell renders with empty data (no products in grid, no draft tabs, no cash movements), and a cashier could in principle tap a button in that window.

Net: brief window of empty-but-functional UI. Acceptable trade for the 933ms win. If it proves confusing in practice, a future polish could overlay a "Loading register..." indicator that fades out when the bundle arrives.

Prod test — 4/4

Raw: result.json

Regression sweep — 51/51 green

| Suite | Result |
| --- | --- |
| test-phase1-prod.mjs | 11/11 |
| test-phase2-sso-outdoor-prod.mjs | 6/6 |
| test-phase2-cafe-multishop-prod.mjs | 6/6 |
| test-m1-prod.mjs | 10/10 |
| test-r7-prod.mjs | 14/14 |
| test-r8-prod.mjs | 4/4 |

Files shipped

nix-cafe — commits 576e5cf (pivot) + 1727151 (Suspense)
New: lib/server/open_phase_loader.ts
Modified: app/(pos-fullscreen)/pos/register/[configId]/page.tsx (Suspense wrapper + async ProOpenContent child server component)

Note: the slice originally pushed a much bigger architecture (action-returns-bundle + client state machines on both lockable shells + new types in lock-screens.tsx). That whole approach was reverted in the pivot commit; only the loader file + the Suspense wrapper on page.tsx remain. Tracked diagnostic commits (409eb78, 425efc3) were also superseded by the pivot.