Code Splitting in SolidJS: Lazy Loading Done Right

How we structured our SolidJS app for optimal code splitting - Suspense boundaries, lazy components, and the patterns that actually work.

We cut our main bundle from 1,335 KB to 613 KB. A 54% reduction. The secret wasn’t some clever Vite config; it was understanding where lazy loading actually matters in SolidJS.

Here’s what we learned building Ditto’s code splitting architecture.

The Problem: One Giant Library

Our bundle had a problem named vis-network. This graph visualization library weighs 568 KB and was shipped on every page load—even though most users never touch the memory network feature.

Classic code splitting scenario. Except SolidJS has some quirks that make the standard React patterns fail.

SolidJS lazy() Is Not React.lazy()

SolidJS provides a lazy() function that looks familiar:

import { lazy } from "solid-js";

// Looks like React, right?
const MemoryNetwork = lazy(() => import("@/components/modals/MemoryNetwork"));

But the mental model is different. In React, components re-render when props change. In SolidJS, the component function runs once. This affects how Suspense boundaries behave.

We started with the obvious approach:

const modalRegistry = {
  memoryNetwork: {
    component: () => (
      <Suspense fallback={<LoadingSpinner />}>
        <MemoryNetwork />
      </Suspense>
    ),
  },
};

This works. But it creates problems.

The Suspense Boundary Problem

Standard <Suspense> in SolidJS can cause unexpected behavior. The boundary location affects what gets hidden during loading. Parent suspense boundaries can hide unrelated content. And you get less control over fallback timing.

We needed something more predictable.

LazyShow: A Better Pattern

We built a custom LazyShow component that combines SolidJS’s createResource with Show:

import { createResource, Show, type Component, type JSX } from "solid-js";

function LazyShow<T extends Component>(props: {
  load: () => Promise<{ default: T }>;
  fallback: JSX.Element;
  children: (Comp: T) => JSX.Element;
}) {
  const [mod] = createResource(props.load);

  return (
    <Show when={mod()} fallback={props.fallback}>
      {(m) => props.children(m().default)}
    </Show>
  );
}

Key benefits:

  • Skeleton renders synchronously (no waiting for the JS chunk)
  • Chunk loading happens in parallel with skeleton display
  • Once loaded, the component handles its own data fetching
  • No suspense boundary propagation issues

Usage looks like this:

const modalRegistry = {
  settings: {
    component: () => (
      <LazyShow
        load={() => import("@/components/modals/Settings")}
        fallback={<SettingsSkeletonModal />}
      >
        {(Settings) => <Settings />}
      </LazyShow>
    ),
  },
};

The Three-Phase Loading Model

Our architecture separates loading into three distinct phases:

Phase 1: Synchronous (instant)

  • Modal state updates
  • Modal wrapper renders
  • Skeleton content appears

Phase 2: Async chunk loading (~50-200ms)

  • Skeleton visible during this phase
  • Dynamic import fetches the JS chunk
  • User sees immediate feedback

Phase 3: Component render

  • Real component replaces skeleton
  • Component triggers its own data fetches
  • Component handles its own loading states

The key insight: skeletons only display during chunk loading, not data fetching. This creates a perception of instant response.
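The ordering guarantee behind these phases can be sketched without any JSX. This is a minimal sketch, not Ditto's actual code: openWithPhases and onPhase are invented names, and onPhase stands in for rendering.

```typescript
// Phases reported in order; onPhase stands in for actual rendering.
type Phase = "skeleton" | "component" | "data";

function openWithPhases(
  loadChunk: () => Promise<{ default: (onData: () => void) => void }>,
  onPhase: (p: Phase) => void,
): Promise<void> {
  // Phase 1: synchronous. The skeleton is reported before any await,
  // so it appears even when the chunk resolves instantly from cache.
  onPhase("skeleton");
  // Phase 2: the JS chunk is fetched while the skeleton stays visible.
  return loadChunk().then((mod) => {
    // Phase 3: the real component mounts and starts its own data fetch.
    onPhase("component");
    mod.default(() => onPhase("data"));
  });
}
```

Because the skeleton callback fires before the first await, phase 1 is guaranteed to be synchronous regardless of how fast the chunk loads.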

Skeleton Design That Works

Our skeletons follow specific principles:

  1. Synchronous rendering — Plain Tailwind divs, no async operations
  2. Accurate structure — Match actual component layout closely
  3. Real icons — Use actual Lucide icons (opacity-reduced) instead of placeholder rectangles
  4. Subtle animation — 2-second pulse, not aggressive scaling

Here’s our Settings skeleton:

import { Crown } from "lucide-solid";

export default function SettingsSkeleton() {
  return (
    <div class="flex flex-col h-full">
      {/* Tab bar with actual icons */}
      <div class="flex border-b border-border px-2 pt-2">
        <div class="flex items-center gap-1.5 px-3 py-2 border-b-2 border-primary">
          <Crown size={16} class="opacity-60" />
          <span class="text-sm font-medium opacity-60">Account</span>
        </div>
        {/* Other tabs... */}
      </div>

      {/* Content placeholders with animate-pulse */}
      <div class="flex-1 p-4 space-y-6">
        <div class="h-24 bg-muted/60 rounded-lg animate-[pulse_2s_ease-in-out_infinite]" />
        {/* More placeholders... */}
      </div>
    </div>
  );
}

Using real icons creates visual continuity. The transition from skeleton to real component feels seamless.

Background Prefetching

Lazy loading is useless if users wait every time they open a modal. We added prefetching during browser idle time:

export function prefetchModals() {
  const prefetch = () => {
    // Settings first (most commonly accessed)
    import("@/components/modals/Settings");
    import("@/components/modals/ImageViewer");
    import("@/components/modals/MemoriesDashboard/MemoriesDashboardOverlay");
    import("@/components/modals/MemoryNetwork");
    import("@/components/modals/PersonalityAssessments/PersonalityAssessmentOverlay");
  };

  if ("requestIdleCallback" in window) {
    requestIdleCallback(prefetch);
  } else {
    // Safari fallback
    setTimeout(prefetch, 100);
  }
}

Called on HomeScreen mount. By the time users want to open a modal, the chunk is already cached.

Safari doesn’t support requestIdleCallback, hence the fallback. 100ms is fast enough to prefetch before user interaction but after initial render completes.
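The feature test can be lifted into a small helper so the fallback path is testable outside the browser. A sketch under that assumption; makeIdleScheduler is an illustrative name, not part of the app's API:

```typescript
// Prefer requestIdleCallback when the host provides it; otherwise fall
// back to a short setTimeout (the Safari path described above).
type IdleCallback = (cb: () => void) => void;

function makeIdleScheduler(
  ric: IdleCallback | undefined,
  fallbackDelayMs = 100,
): IdleCallback {
  if (ric) return (cb) => ric(cb);
  return (cb) => void setTimeout(cb, fallbackDelayMs);
}

// In the browser:
// const scheduleIdle = makeIdleScheduler(
//   "requestIdleCallback" in window ? requestIdleCallback : undefined,
// );
// scheduleIdle(prefetch);
```

Injecting the scheduler also makes it trivial to stub in tests, rather than monkey-patching window.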

Button Loading States

What if users click before prefetch completes? We added loading states to TopBar buttons:

const lazyModalImports: Partial<Record<ModalId, () => Promise<unknown>>> = {
  settings: () => import("@/components/modals/Settings"),
  memories: () => import("@/components/modals/MemoriesDashboard/..."),
};

const handleClick = async () => {
  const lazyImport = lazyModalImports[props.modalId];
  if (lazyImport) {
    setIsLoading(true);
    try {
      await lazyImport();
    } finally {
      setIsLoading(false);
    }
  }
  modal.createOpenHandler(props.modalId)();
};

If prefetch already loaded the chunk, this resolves instantly. If not, users see a spinner on the button.
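That instant resolution works because dynamic import() caches per module, so the prefetch and the click share one load. The same guarantee can be made explicit with a small memoizing loader; this is a sketch, and createModuleCache is an invented helper, not code from the post:

```typescript
// Memoize one in-flight promise per modal id so a prefetch and a later
// button click share the same load. Native import() behaves this way
// already; the cache just makes the guarantee explicit and countable.
function createModuleCache<K extends string>(
  loaders: Record<K, () => Promise<unknown>>,
): (id: K) => Promise<unknown> {
  const cache = new Map<K, Promise<unknown>>();
  return (id) => {
    let inflight = cache.get(id);
    if (!inflight) {
      inflight = loaders[id]();
      cache.set(id, inflight);
    }
    return inflight;
  };
}
```

Caching the promise (not the resolved module) also deduplicates concurrent clicks while the chunk is still downloading.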

What We Lazy Load (And What We Don’t)

Not everything needs lazy loading. We targeted heavy, infrequently-accessed components:

Lazy loaded:

  • MemoryNetwork / SubjectsManagementView (568 KB — vis-network library)
  • PersonalityAssessment (43 KB)
  • MemoriesDashboard (14 KB)
  • ImageViewer (8 KB)

Kept synchronous:

  • ComposeModal (frequently used)
  • TokenModal (small)
  • WhatsNew (small, static)

The vis-network isolation was the big win. That single change moved 568 KB out of the main bundle.

Vite Did The Right Thing

We didn’t need to write any manualChunks configuration by hand. Vite automatically created optimal chunks based on our lazy() boundaries.

There was one complexity: WidescreenLayout also imports MemoriesNetworkGraph. We worried this might pull vis-network back into the main bundle. But Vite recognized the shared dependency and created a shared chunk that’s only loaded when either lazy component needs it.

Trust your bundler. It’s smarter than you think.
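For contrast, the manual configuration we avoided would look something like this. The grouping below is hypothetical; we shipped without it:

```typescript
// vite.config.ts: the manualChunks approach we did NOT need.
// Shown only for contrast; lazy() boundaries made this unnecessary.
import { defineConfig } from "vite";

export default defineConfig({
  build: {
    rollupOptions: {
      output: {
        manualChunks: {
          // Force vis-network into its own chunk (hypothetical grouping).
          "vis-network": ["vis-network"],
        },
      },
    },
  },
});
```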

The Results

Metric         Before     After      Improvement
Main bundle    1,335 KB   613 KB     54% smaller
Gzip main      ~400 KB    188 KB     53% smaller
vis-network    In main    Deferred   568 KB on demand
Total chunks   4          10+        More granular

Initial load is faster. Memory network users still get their feature. Everyone wins.

Lessons Learned

1. Separate chunk loading from data loading. Skeletons should only show during chunk loading. Let components handle their own data states.

2. Don’t use generic skeletons. A skeleton that doesn’t match the component layout creates jarring transitions. Take the time to build accurate placeholders.

3. Prefetch during idle time. Lazy loading without prefetching just moves the wait. Use requestIdleCallback to load chunks before users need them.

4. LazyShow > Suspense for modals. At least in SolidJS. The explicit fallback handling is more predictable than Suspense boundary behavior.

5. Target the big dependencies. Our entire optimization was essentially “don’t load vis-network on page load.” Find your vis-network.

What’s Next

We’re considering lazy loading the entire WidescreenLayout for mobile-only users. No point loading desktop features on phones.

Service worker caching for chunks is also on the list. Once a user loads a chunk, it should be instant forever.

Code splitting isn’t a one-time fix. It’s an ongoing practice of watching bundle sizes and questioning what really needs to load upfront.


Building something with SolidJS? I’m happy to chat about what we learned. Reach out anytime.

We’ve open-sourced our SolidJS patterns (including LazyShow and code splitting) as a skill for AI coding agents:

npx skills add https://github.com/omniaura/skills --skill solidjs-patterns

— Peyton