How We Measured Our SolidJS Performance Gains
You can claim any performance number you want. 10x faster! 100x smaller! The hard part is measuring it correctly.
When we migrated Ditto from React to SolidJS, we wanted real numbers. Not synthetic benchmarks. Not “it feels faster.” Actual, reproducible measurements comparing the old and new versions running the same code, handling the same interactions.
Here’s how we did it, and the mistakes that almost gave us garbage data.
The Setup
We tested two live deployments side-by-side:
- React version: production at assistant.heyditto.ai
- SolidJS version: Firebase preview build from PR #641
Both were tested using Chrome’s built-in Performance APIs via browser automation. Same prompts, same interactions, same measurement code running against both tabs.
The First Gotcha: Invisible Tabs Don’t Paint
Our first round of testing showed N/A for First Contentful Paint on the SolidJS version. The React version reported 576ms. Seemed like a clear win for React, right?
Wrong. Paint timing APIs only record metrics when document.visibilityState === 'visible'. We were running tests with tabs in the background, so the browser literally wasn’t painting anything.
The fix: Run both tabs side-by-side, both visible, then hard refresh each one. Now we got real FCP numbers: 332ms for SolidJS vs 576ms for React.
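A small guard catches this failure mode before it pollutes your data. Here's a sketch (readFcp is our own helper name, not a browser API): paint entries are simply never recorded for hidden tabs, so a missing entry should be reported as "not measured" rather than counted as a slow paint.

```javascript
// Read FCP defensively: paint timing entries only exist for tabs that
// were visible, so a hidden tab yields "not measured", not a bad score.
function readFcp(visibilityState, paintEntries) {
  if (visibilityState !== "visible") return null; // tab never painted
  const fcp = paintEntries.find((e) => e.name === "first-contentful-paint");
  return fcp ? fcp.startTime : null;
}

// In the browser you would call it as:
// readFcp(document.visibilityState, performance.getEntriesByType("paint"));
```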
The Second Gotcha: Cache Makes Everything Look Amazing
Our early SolidJS measurements showed TTFB (Time to First Byte) of 1-2ms. React was showing 50-175ms. A 50-100x improvement in server response time from a framework change? That’s not how any of this works.
The SolidJS preview was cached locally. React was hitting the CDN fresh.
The fix: Hard refresh (Cmd+Shift+R) on both tabs to bypass cache. Final TTFB measurements: 56ms for SolidJS, 52ms for React. Basically identical, as expected.
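That sanity check can be automated. A minimal sketch (checkTtfb is a hypothetical helper of ours): under the Resource Timing spec, a document served from the local cache reports a transferSize of 0, so you can flag cached runs before trusting their TTFB.

```javascript
// Sanity-check a navigation entry before trusting its TTFB.
// A transferSize of 0 generally means the document came from the local
// cache (note: cross-origin resources without Timing-Allow-Origin can
// also report 0, so treat this as a red flag, not proof).
function checkTtfb(nav) {
  return {
    ttfb: nav.responseStart - nav.requestStart,
    cached: nav.transferSize === 0,
  };
}

// Browser usage:
// checkTtfb(performance.getEntriesByType("navigation")[0]);
```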
The Third Gotcha: React’s Numbers Wouldn’t Sit Still
React’s Load Complete metric varied wildly between test runs: anywhere from 190ms to 1,178ms. Same page, same network, wildly different results.
This turned out to be a side effect of aggressive code splitting. React was loading 24 separate script files, each a potential network request. SolidJS loaded 4. More requests means more variance.
The fix: Run multiple tests and document the ranges. The variability itself became part of the story.
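Reporting ranges instead of single numbers is easy to script. A sketch (summarize is our own helper): collect Load Complete across several runs, then report min/median/max so the variance is visible in the write-up.

```javascript
// Summarize repeated measurements: report the spread, not one lucky run.
function summarize(runs) {
  const sorted = [...runs].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  const median =
    sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
  return { min: sorted[0], max: sorted[sorted.length - 1], median };
}
```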
What We Actually Measured
Here’s the measurement code we ran against both tabs:
```javascript
// Performance timing from the Navigation Timing API
const perf = performance.getEntriesByType("navigation")[0];
const fcp = performance.getEntriesByName("first-contentful-paint")[0];

// Memory via Chrome's proprietary API (not available in other browsers)
const memory = performance.memory;

// DOM complexity
const domElements = document.querySelectorAll("*").length;

// Script execution time
const scripts = performance
  .getEntriesByType("resource")
  .filter((r) => r.initiatorType === "script");
const totalScriptDuration = scripts.reduce((sum, s) => sum + s.duration, 0);
```
We also ran stress tests: send three messages, measure memory after each one. Both apps stayed stable (SolidJS at 32-33 MB, React at 38-41 MB before garbage collection).
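The stress-test loop can be sketched like this. memoryAfterEach is our own helper, written to take a memory reader as a parameter so it isn't hard-wired to Chrome; in the browser you'd pass () => performance.memory.usedJSHeapSize.

```javascript
// Run each interaction (e.g. sending a message) and sample memory after it.
// `actions` is an array of async functions; `readMemory` returns a number.
async function memoryAfterEach(actions, readMemory) {
  const samples = [];
  for (const act of actions) {
    await act(); // e.g. send one chat message and await the response
    samples.push(readMemory());
  }
  return samples;
}
```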
The Numbers That Mattered
After fixing our measurement methodology:
| Metric | SolidJS | React | Difference |
|---|---|---|---|
| First Contentful Paint | 332ms | 576ms | 42% faster |
| Initial Memory | 34 MB | 67 MB | 49% less |
| DOM Elements | 995 | 1,384 | 28% fewer |
| Script Count | 4 | 24 | 83% fewer |
| Total Script Duration | 145ms | 1,105ms | 7.6x faster |
| Transfer Size | 376 KB | 683 KB | 45% smaller |
| DOM Content Loaded | 277ms | 203ms | React 27% faster |
| Load Complete | 301ms | 258ms | React 14% faster |
The 7.6x script duration improvement was the standout. That’s not a typo. SolidJS’s fine-grained reactivity means the browser spends dramatically less time executing JavaScript.
Where React won: DOMContentLoaded and Load Complete were faster in React. Why? React’s aggressive code splitting (24 scripts vs 4) means the browser can start parsing and executing code in smaller chunks. SolidJS’s single larger bundle takes longer for the browser to fully process. But those internal browser metrics don’t translate to user-perceived speed—FCP (what users actually see) was 42% faster with SolidJS.
The Qualitative Test That Actually Mattered
Numbers are great. But the test that convinced me we were on the right track? Using Ditto on my phone.
The React version made my phone warm during extended conversations. The SolidJS version doesn’t. That’s not a metric you can capture with Performance APIs, but it’s the one users will notice most.
Lessons for Your Own Testing
If you’re comparing framework performance, here’s what I learned:
- Both tabs must be visible when measuring paint timing. Background tabs don't paint.
- Hard refresh everything. Cached resources will give you fantasy numbers.
- Run multiple tests. Code splitting and network variance can swing results by 5x.
- Measure what users feel. Script duration matters more than bundle size. Memory matters more than load time.
- Test interactions, not just loads. Our stress test (3 messages in sequence) caught memory patterns that initial load testing missed.
The tools are all built into Chrome. You don’t need a fancy testing framework. You need discipline about controlling variables and skepticism about numbers that seem too good.
The Takeaway
Our SolidJS migration is real, and the performance gains are real. But getting accurate numbers required failing several times first.
The 7.6x script execution improvement comes from SolidJS’s fundamentally different reactivity model. The 49% memory reduction comes from not carrying around React’s virtual DOM. The 42% faster FCP comes from shipping less code.
These aren’t magic. They’re tradeoffs. SolidJS requires you to think differently about component structure (no destructuring props, no early returns). But for an app like Ditto that runs continuously on mobile devices, the performance wins are worth the mental model shift.
Want to see the full technical report? Check out our migration story for the complete breakdown.