Reverse-Engineering ChatGPT’s Scroll Behavior
We spent over a year fighting scroll behavior in our chat interface. Chrome on Android’s toolbar show/hide breaking viewport height. iOS Safari’s nested scroll container confusion. Token streaming auto-scroll that users couldn’t interrupt. A major TypeScript refactor consolidating three overlapping scroll flags. A React-to-SolidJS migration. Hundreds of commits, dozens of internal documents, multiple complete architectural rewrites.
Then we reverse-engineered ChatGPT and discovered they use zero JavaScript scroll manipulation.
Here’s the story of how we deleted 80% of our scroll code and ended up with something that feels better.
The Problem We Were Trying to Solve
Chat streaming is deceptively complex. As an AI response streams in, new content appears at the bottom of the conversation. The UI needs to:
- Show the streaming content as it arrives
- Not hijack the user’s scroll if they’ve scrolled up to read earlier messages
- Feel smooth, not jittery
- Work on mobile (where scroll physics are more sensitive)
Our first implementation tried to be smart. We tracked whether the user was “at the bottom” and auto-scrolled to follow new content. We had effects watching streaming state, scroll event listeners, RAF loops to smooth out positioning.
It worked. But it felt off. There was a subtle jitter during streaming, like the UI was fighting itself.
The Investigation
We decided to see how ChatGPT handles it. Using Chrome’s JavaScript console, we intercepted their scroll APIs:
// Intercept scrollTo
const originalScrollTo = Element.prototype.scrollTo;
Element.prototype.scrollTo = function (...args) {
  console.log("[INTERCEPT] scrollTo called", args);
  return originalScrollTo.apply(this, args);
};

// Intercept scrollIntoView
const originalScrollIntoView = Element.prototype.scrollIntoView;
Element.prototype.scrollIntoView = function (...args) {
  console.log("[INTERCEPT] scrollIntoView called", args);
  return originalScrollIntoView.apply(this, args);
};

// Track scroll events
let scrollEventCount = 0;
document.addEventListener(
  "scroll",
  () => {
    scrollEventCount++;
  },
  { capture: true },
);
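The results below also count writes to `scrollTop`, which the wrappers above can’t catch because `scrollTop` is an accessor property, not a method. A sketch of one way to wrap a setter (the helper name is ours; in the browser you would call it with `Element.prototype` and `"scrollTop"`):

```javascript
// Hypothetical helper (our name, not a standard API): wrap an accessor
// property's setter so every write is logged before being applied.
function interceptSetter(proto, prop, log = console.log) {
  const desc = Object.getOwnPropertyDescriptor(proto, prop);
  Object.defineProperty(proto, prop, {
    configurable: true,
    get() {
      return desc.get.call(this);
    },
    set(value) {
      log(`[INTERCEPT] ${prop} setter called`, value);
      desc.set.call(this, value);
    },
  });
}

// In the browser: interceptSetter(Element.prototype, "scrollTop");
```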
We sent a message and watched the console during streaming.
Result: 0 scrollTo() calls. 0 scrollIntoView() calls. 0 scrollTop setter calls.
ChatGPT uses zero JavaScript scroll manipulation during streaming. The browser handles everything automatically.
What ChatGPT Actually Does
We dug into their DOM structure and CSS. Here’s what we found:
div.scroll-container (overflow-y: auto)
├── main
│   └── div.messages-area (flex-grow: 1)
│       ├── div.message-list (padding-bottom: 100px)
│       │   ├── article[user message]
│       │   └── article[assistant message]
│       └── div.composer (position: sticky; bottom: 0)
The key CSS properties:
| Element | Property | Value |
|---|---|---|
| Scroll container | overflow-y | auto |
| Scroll container | overflow-anchor | auto (browser default) |
| Composer | position | sticky |
| Composer | bottom | 0 |
That’s it. No JavaScript scroll logic. They rely entirely on the browser’s native CSS scroll anchoring.
How CSS Scroll Anchoring Works
The overflow-anchor: auto property (browser default since 2017) tells the browser to maintain the user’s visual scroll position when content is added.
Here’s the behavior:
- The browser picks an “anchor node”, typically the first visible element in the scroll container
- When content is added or removed, the browser adjusts scrollTop to keep the anchor node in the same viewport position
- This happens automatically, without any JavaScript
During ChatGPT streaming:
- Content grows at the bottom
- Browser adjusts scroll position to keep visible content stable
- If the user was at the bottom, they stay at the bottom
- If they scrolled up, they stay where they are
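As a mental model only (a toy sketch, not the CSS spec’s anchor-selection algorithm), the behavior above can be expressed as a pure function: a reader pinned to the bottom tracks the new maximum scroll offset, while everyone else keeps their position.

```javascript
// Toy model of the behavior described above, not the spec algorithm.
// All heights are in pixels; content is assumed to grow at the bottom.
function nextScrollTop({ scrollTop, clientHeight, oldScrollHeight, newScrollHeight }) {
  const maxScrollTop = oldScrollHeight - clientHeight;
  const wasAtBottom = scrollTop >= maxScrollTop;
  // Pinned readers follow growth; everyone else keeps their anchor in place.
  return wasAtBottom ? newScrollHeight - clientHeight : scrollTop;
}
```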
The browser already solved this problem.
Our Original Architecture (The Mistake)
Here’s a simplified version of what we had:
// Track if user is "at bottom"
const [isAtBottom, setIsAtBottom] = createSignal(true);

// Scroll event listener
scrollContainer.addEventListener("scroll", () => {
  const atBottom =
    scrollContainer.scrollHeight - scrollContainer.scrollTop <=
    scrollContainer.clientHeight + 150;
  setIsAtBottom(atBottom);
});

// Effect to auto-scroll during streaming
createEffect(() => {
  if (isStreaming() && isAtBottom()) {
    requestAnimationFrame(() => {
      scrollContainer.scrollTop = scrollContainer.scrollHeight;
    });
  }
});

// Effect for when streaming completes
let wasStreaming = false;
createEffect(() => {
  if (!isStreaming() && wasStreaming) {
    // Aggressive follow-up scroll
    setTimeout(() => {
      if (isAtBottom()) {
        scrollContainer.scrollTop = scrollContainer.scrollHeight;
      }
    }, 100);
  }
  wasStreaming = isStreaming();
});
This code actively fought the browser’s native scroll anchoring. Every time we called scrollTop = scrollHeight, we triggered a scroll event, which updated isAtBottom, which potentially triggered another scroll. The jitter came from our code and the browser’s anchoring competing for control.
The New Architecture
Our scroll code now does almost nothing during streaming:
// On send: one scroll to position user message
function onMessageSend() {
  const userMessage = document.querySelector(".latest-user-message");
  scrollContainer.scrollTop = userMessage.offsetTop - 80;
  // Then: nothing. Browser handles streaming.
}

// On history load: preserve position
function onHistoryLoad(previousScrollHeight: number) {
  const delta = scrollContainer.scrollHeight - previousScrollHeight;
  scrollContainer.scrollTop += delta;
}

// During streaming: nothing
That’s the entire model:
- On send: Scroll user message to top of viewport (our one custom behavior)
- During streaming: Do nothing. Browser handles content growth.
- On history load: Adjust scrollTop by the height delta of loaded content.
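The history-load adjustment is pure arithmetic, so it can be factored into a unit-testable helper (the name is ours, not from the codebase above):

```javascript
// Prepending older messages grows scrollHeight; shifting scrollTop by the
// same delta keeps the previously visible message in the same viewport spot.
function preservedScrollTop(scrollTop, previousScrollHeight, newScrollHeight) {
  return scrollTop + (newScrollHeight - previousScrollHeight);
}
```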
The Sticky Composer Pattern
One detail we adopted from ChatGPT: the composer lives inside the scroll container with position: sticky; bottom: 0.
<div class="scroll-container">
  <div class="messages">
    <!-- messages here -->
  </div>
  <div class="composer sticky bottom-0 bg-background">
    <!-- input here -->
  </div>
</div>
This eliminates the need for JavaScript positioning of the input area. It stays at the bottom of the viewport automatically, and users can scroll past it to see older messages.
One gotcha: the composer background must be 100% opaque. We initially had bg-background/80 (80% opacity with backdrop blur) and messages were visibly peeking through underneath.
The Monotonic Buffer
We added one enhancement ChatGPT doesn’t have: a “monotonic buffer” at the bottom of the message list.
When the user sends their first message, we add approximately 50vh of bottom padding. This lets users swipe the newest messages up to eye level rather than having them stuck at the screen’s bottom edge.
The critical insight: once added, we never remove this buffer. If we toggled it off after streaming and the user had scrolled into that space, removing it would cause a massive layout shift. Making it monotonic (add once, never remove) guarantees layout stability.
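A minimal sketch of that rule, assuming a module-level flag and roughly 50vh of padding (all names are ours, not from the actual implementation):

```javascript
// Sketch of the monotonic buffer: pad the bottom of the message list once,
// and never shrink it again, so removing it can never cause a layout shift.
let bufferApplied = false;

function ensureMonotonicBuffer(messageList, viewportHeight) {
  if (bufferApplied) return; // monotonic: add once, never remove
  messageList.style.paddingBottom = `${Math.round(viewportHeight * 0.5)}px`;
  bufferApplied = true;
}
```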
SolidJS-Specific Gotchas
One bug took us hours to track down. Our spacer height was intermittently 0, breaking the initial scroll positioning.
The culprit: a variable inside a <Show> block.
// Bug: hasScrolledInitially resets on every Show re-render
const MessageComponent = () => (
  <Show when={optimisticMessage()}>
    {(() => {
      let hasScrolledInitially = false; // Resets when signal updates!
      return <div />;
    })()}
  </Show>
);
In SolidJS, the callback inside <Show> re-runs when the condition’s underlying signal updates. Our hasScrolledInitially flag kept resetting to false, causing duplicate spacer creation.
Fix: track state at component level, outside the Show scope.
let spacerCreatedForMessageId: string | null = null; // Component level

const MessageComponent = () => (
  <Show when={optimisticMessage()}>
    {(() => {
      if (spacerCreatedForMessageId === currentMessageId) return null;
      spacerCreatedForMessageId = currentMessageId;
      return <div />;
    })()}
  </Show>
);
This is a general SolidJS pattern: variables inside reactive blocks have different lifecycles than you might expect coming from React.
The Results
Before:
- Hundreds of lines of scroll management code
- RAF loops, event listeners, effect chains
- Subtle jitter during streaming
- Fighting the browser’s native behavior
After:
- ~20 lines of scroll logic
- One scroll on send, one adjustment on history load
- Smooth streaming (browser handles it)
- Premium, “boring” feel
The less we do, the better it feels. Browsers are really good at scrolling. Let them.
Debugging Tips
If you’re investigating scroll behavior, here’s what we learned:
Use console logs, not screenshots. Screenshot timing is too slow to catch mid-stream behavior. Add targeted logging:
console.log("[DEBUG] scroll state", {
  scrollTop: container.scrollTop,
  scrollHeight: container.scrollHeight,
  clientHeight: container.clientHeight,
  distanceFromBottom:
    container.scrollHeight - container.scrollTop - container.clientHeight,
});
Hard refresh between tests. Vite’s HMR can leave multiple module versions running simultaneously, causing bugs that don’t exist in production. Use Cmd+Shift+R.
Intercept scroll APIs. The interception pattern above immediately tells you whether scroll issues are caused by your code or browser behavior.
Takeaways
- Check browser defaults first. overflow-anchor: auto has existed since 2017. We spent a week building what CSS does automatically.
- Reverse-engineer the competition. Five minutes with Chrome DevTools told us more than hours of speculation.
- Simpler is usually better. Our “smart” scroll code was the problem. Deleting it was the fix.
- Respect the platform. Browsers have decades of optimization behind scroll physics. Fighting them creates jitter. Working with them creates smoothness.
The premium feel we were chasing didn’t come from adding clever code. It came from removing the code that was getting in the way.
Building a chat interface? I’m happy to share more details. Reach out anytime.
— Peyton