
The desire to shave milliseconds off JavaScript build times has been relentless, but progress has been slow. Recently, several companies have stepped up to address this challenge by supercharging their JavaScript bundlers: Vercel, a cloud platform; VoidZero, Evan You’s startup focused on JavaScript ecosystem infrastructure; and ByteDance, the consumer services giant best known for TikTok. I’ve written about JavaScript Delivery extensively, but with recent announcements including Vite+, Rspack 1.6, and the Next 16 (beta) release enabling Turbopack by default, it’s worth chatting specifically about not only the current state of bundlers, but also why DevTools companies are throwing so much money at them.
JS module bundlers organize code and its dependencies by merging numerous files into one optimized package. Bundlers streamline loading in the browser, and lately building a faster one has become the Formula 1 of JavaScript in the sense that well-funded teams pour enormous resources into marginal gains on what is essentially a standardized chassis. Given how much time developers waste waiting for builds, the investment is reasonable. But there’s an irony here. Huge amounts of energy have gone into improving developer experience, while user experience—the actual runtime performance of what ships to the browser—remains an afterthought. Bloated JavaScript is still very much a user problem, meaning that the real trophy isn’t faster builds; it’s smaller artifacts, less unused code, and bundles that don’t punish end users with sluggish page loads. That shift will likely require bundlers and compilers to work more closely together, enabling smarter cross-module optimization. In other words, the tournament for faster builds may eventually give way to a more meaningful contest—one measured not in developer wait times, but in the performance users actually experience.
Bundle Nation
A wide range of bundlers is available in the JS market, from established tools to next-generation options. Webpack remains the go-to choice for large, complex applications, while Rollup is favored by many library authors for its efficient tree-shaking and flexible output formats. Browserify, introduced in 2011, paved the way by enabling Node-style modules in the browser, and lightweight solutions such as Microbundle continue to serve developers seeking simplicity for small libraries. Faster, modern alternatives include esbuild, a high-performance bundler written in Go; Parcel, known for its zero-configuration setup; and Rolldown, a Rust-based project from VoidZero offering Rollup-compatible APIs. Beyond single-purpose bundlers, all-in-one toolkits have emerged that combine bundling with runtimes, package managers, and test runners. VoidZero’s Vite has attracted many developers owing to its seamless development experience built around native ES modules and hot module replacement, while Anthropic’s newly acquired Bun—written in Zig and powered by JavaScriptCore—bundles a fast JavaScript runtime, bundler, package manager, and test runner into a single executable. On the Rust-powered front, ByteDance’s Rspack and Vercel’s Turbopack represent the latest wave of high-speed bundlers targeting enterprise-scale applications.
What emerges from this crowded landscape is a clear pattern. The ecosystem is a mix of open-source community projects and vendor-backed tools, each building on or reimagining the work of its predecessors. Rolldown extends Rollup’s ideas; Rspack bills itself as a faster Webpack alternative. And increasingly, the newest generation of bundlers is being rewritten in Rust (or Go), trading JavaScript’s flexibility for raw speed (more on Rust below). The throughline connecting all of these efforts is the same: faster build times.
The JavaScript bundler ecosystem reflects not just innovation for its own sake but a response to mounting pressure from developers frustrated by slow, bloated builds. The motives for investing in bundlers are numerous: productivity, speed, and the ability to tame JavaScript’s ever-growing complexity. As web projects exploded in size and scope, bundlers became essential to managing the influx of dependencies and files. They’re the natural consequence of dependency sprawl that has plagued the JS community since the SPA era’s inception. In some ways, the industry’s ongoing concern with faster build tools may be seen as a kind of bandage over the deeper structural issues of JS-heavy frameworks.
Benchmark Blur
Benchmarks are unreliable, and they always have been. That’s not to say vendors don’t spend huge amounts of money trying to demonstrate speed: in the era of TPC-C database benchmarks, an entire industry coalesced around performing better on those standardized tests. Still, I am interested in the metrics vendors cite to show improvement in their bundlers. The Next.js team at Vercel, for instance, has spent years optimizing its bundler. It has also hired top talent, notably Tobias Koppers, creator of Webpack, to work on the problem in-house.
Lately, this investment seems to be paying off. According to Vercel’s blog, the stable release of Turbopack boasts “up to 5-10x faster Fast Refresh, and 2-5x faster builds.” That sounds impressive, but if you’re a developer shopping for a bundler you’ll soon discover (as I have) that the pace of improvement is hard to gauge, because the competition boasts similar gains. At ViteConf 2025, for example, Evan You touted 33% faster builds for the latest release of Rolldown, Vite’s bundler, “running against the same benchmark 10,000 React components.” Rspack’s latest release, Rspack 1.6, claims to be “11% faster” than the previous version. Drama around whether to trust these claims aside, the trend is clear: bundlers, we are told, are getting faster and faster.
In addition to benchmarks provided by the vendors themselves (see Farm, Vite), developers are also putting these bundlers through their paces. Some provide receipts showing faster builds in their own repos to explain why they opted to switch, such as this article by Honza Hrubý, Staff Frontend Engineer at Mews: “Goodbye Webpack, Hello Rspack (and 80% Faster Builds).” Others are still seeking the perfect general solution. Kyle Gill, Software Engineer at Particl, for example, starts with a question, “Is Vite faster than Turbopack?,” and ends with “Well, it depends.”
The problem with relying on benchmarks is that new versions ship constantly—what’s fastest today may not be tomorrow, and results vary wildly depending on the use case. This makes it nearly impossible to crown a definitive winner in the benchmark race. Therefore, rather than chase the latest numbers, it’s more useful to understand the underlying conditions that are actually driving this obsession with performance.
Why everyone is losing their minds about bundlers
I spoke with a lot of people to try and suss out this situation—some on the record, some off—and here’s what I discovered. The “how” behind speed improvements in JS bundling is pretty consistent across these tools: Rust. Rust gives bundlers native-speed throughput by using direct, predictable memory management and zero-cost abstractions instead of garbage-collection pauses. The fewer the runtime overheads, the faster the tool can transform, hash, and emit assets. Many bundlers today are either written in Rust or have been recently rewritten in it, with esbuild’s choice of Go and Bun’s use of Zig being notable exceptions.
It’s the “why” JS bundlers are so important today that has been contentious. Sure, faster is better, but is it really worth the investment? Here’s what I learned: bundler-furor is the result of outsized JS artifacts. There’s too much JS.
Tobias Koppers has reflected on the future of bundlers in the past, particularly in his 2022 JSNation presentation “Webpack in 5 years,” in which he explains that the loss of ecosystem is a huge deterrent to rewriting Webpack. But that was 3 whole years ago! When I interviewed Koppers this year, I asked him to reflect on how JavaScript bundlers have evolved since he created Webpack, and specifically what new demands he sees from web developers and in the JavaScript ecosystem now that he’s working on Turbopack. Koppers explained:
Turbopack was mostly created out of the frustration we had with Webpack, like squeezing out the last performance of that. It’s 10 years old, so it wasn’t built for the large scale applications we have nowadays, and they are still growing. We basically started with a completely different architecture with Turbopack. We try to split up everything into really small tasks, to build up like a task graph, to make incremental computation really fast.
Koppers’s point highlights a fundamental shift in how modern bundlers are conceived. Where Webpack once aimed to handle everything through a single, monolithic pipeline, today’s tools prioritize distributed, incremental computation that can scale with massive codebases and multi-framework ecosystems. His emphasis on “small tasks” and “incremental computation” reflects a broader trend toward modular architectures that respond instantly to changes, echoing the needs of developers building ever-larger applications across diverse environments. In essence, the evolution from Webpack to Turbopack isn’t just about speed, it’s about reimagining how build systems adapt to the ever-expanding complexity of the web.
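To illustrate the general idea Koppers describes—and this is only a sketch of incremental computation over a task graph, not Turbopack’s actual architecture—consider a toy build pipeline where each task caches its result, and a change to one input invalidates only that task and its downstream dependents:

```javascript
// A toy incremental task graph (a sketch, NOT Turbopack's implementation).
// Tasks cache results; invalidating one task invalidates its dependents,
// so a rebuild only recomputes what actually changed.

class TaskGraph {
  constructor() {
    this.tasks = new Map(); // name -> { deps, fn }
    this.cache = new Map(); // name -> cached result
  }
  define(name, deps, fn) {
    this.tasks.set(name, { deps, fn });
  }
  invalidate(name) {
    if (!this.cache.has(name)) return;
    this.cache.delete(name);
    // Propagate invalidation downstream.
    for (const [other, { deps }] of this.tasks) {
      if (deps.includes(name)) this.invalidate(other);
    }
  }
  run(name) {
    if (this.cache.has(name)) return this.cache.get(name);
    const { deps, fn } = this.tasks.get(name);
    const result = fn(...deps.map((d) => this.run(d)));
    this.cache.set(name, result);
    return result;
  }
}

// Hypothetical three-stage "build": parse -> transform -> emit.
const g = new TaskGraph();
let source = 'let x=1';
g.define('parse', [], () => source.trim());
g.define('transform', ['parse'], (ast) => ast.toUpperCase());
g.define('emit', ['transform'], (code) => `/* bundle */ ${code}`);

g.run('emit');         // cold build: all three tasks run
g.run('emit');         // warm build: fully cached, nothing re-runs
source = 'let y=2';
g.invalidate('parse'); // only parse and its dependents recompute
console.log(g.run('emit')); // "/* bundle */ LET Y=2"
```

The design choice worth noticing is that cost scales with the size of the change, not the size of the application—which is exactly why this architecture suits the ever-growing codebases Koppers mentions.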
Zack Jackson, an engineer at ByteDance working on the Rspack team and the co-creator of Module Federation, had this to say about JS bundlers in 2025:
A lot of this has been a race to the bottom on speed. Yeah, we’ll obviously want to make Rspack faster, and we continue to do so, but we also know that if you can build production in one minute here versus three minutes here, there’s virtually no business value, all the dollars essentially dry up once you’re within that habitable zone. And I think a lot of places are still trying to compete on the speed when, you know, and if we’re looking at most builds, we’re talking about one taking one minute, and one taking 30 seconds. And those are pretty large apps and developments, obviously super fast. So really, I think the whole speed issue as a whole is just negligible now.
Now that that’s kind of dried up, and we’re all well, within speed of each other, where’s the next kind of war going to be fought? So in my opinion, and it has been for a while, but I think that probably the one of the big areas where this bundler war is going to move to is probably going to be around artifact size… Now this is kind of a dirty secret, and in bundling nobody’s addressing the fact that half the code in every code base is not used, and it purely comes from Terser, basically it was too slow to optimize each file, and some of this is a case of the optimization algo not integrating deeply with cross module analysis. Originally because perf was prohibitively slow. And also the optimizer being owned by some other group from the bundler. Something Rspack and SWC now work much closer together on so that optimization is more intelligent and capable of deeper integration within the bundler.
Jackson’s argument reframes the so-called “speed wars” as largely settled territory. Once build times reached the point of diminishing returns—measured in seconds rather than minutes—the real differentiator shifted elsewhere. The next frontier for bundlers lies in optimizing what actually ships to users: reducing artifact size and eliminating the enormous amount of unused code that still bloats modern bundles. This, he argues, will require deeper cooperation between bundlers and compilers, enabling more intelligent, cross-module optimization. In short, the battle for faster builds is giving way to a more meaningful contest—one focused on efficiency, integration, and the tangible performance users experience in production.
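Jackson’s point about cross-module analysis can be sketched with a toy example (an assumed module representation, not any real bundler’s internals): a per-file minifier must preserve every export, since it cannot know who imports what, but a whole-graph pass can see which exports are actually imported anywhere and flag the rest as dead code.

```javascript
// Sketch of cross-module dead-code detection. The module graph below
// is hypothetical. A file-local optimizer must keep both exports of
// utils.js; a whole-graph pass sees that only formatDate is imported.

const graph = {
  'utils.js': { exports: ['formatDate', 'legacyParser'], imports: {} },
  'app.js':   { exports: [], imports: { 'utils.js': ['formatDate'] } },
};

function unusedExports(graph) {
  // Collect every name imported anywhere in the graph.
  const used = new Set();
  for (const mod of Object.values(graph)) {
    for (const names of Object.values(mod.imports)) {
      names.forEach((n) => used.add(n));
    }
  }
  // Any export never imported is a candidate for elimination.
  const unused = [];
  for (const [file, mod] of Object.entries(graph)) {
    for (const name of mod.exports) {
      if (!used.has(name)) unused.push(`${file}#${name}`);
    }
  }
  return unused;
}

console.log(unusedExports(graph)); // [ 'utils.js#legacyParser' ]
```

Real tree-shaking is far harder than this—side effects, dynamic imports, and re-exports all complicate the analysis—which is why Jackson argues the optimizer and the bundler need to be deeply integrated rather than owned by separate groups.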
Conclusion
The JavaScript bundler Grand Prix reveals a deeper truth about modern web development: we’ve been optimizing lap times while ignoring the weight of the car. While Rust-powered tools have delivered impressive speed gains—shaving build times from three minutes to thirty seconds—the real problem remains the sheer volume of JavaScript we’re shipping. Jackson’s observation that “half the code in every code base is not used” should be a wake-up call. We’ve built increasingly sophisticated pit crews to process bloated applications faster, when perhaps we should be asking why our cars became so heavy in the first place. The Formula 1 comparison feels apt not because of the urgency, but because of the irony—we’ve mobilized enormous engineering resources to solve a problem with diminishing returns.
Yet there’s reason for optimism as the race shifts from raw speed to aerodynamic efficiency. As bundlers reach performance parity, the focus naturally moves to what actually matters for end users: smaller bundles, faster runtime performance, and more intelligent code elimination. The architectural evolution Koppers describes—from monolithic pipelines to distributed, incremental computation—suggests bundlers are becoming smarter, not just faster. When tools like Rspack and SWC collaborate more deeply on cross-module analysis, when bundlers can finally strip out that unused 50% of dead weight, we’ll see real improvements in user experience rather than just shaving milliseconds off developer build times. This is the drag reduction that actually matters.
The bundler ecosystem’s maturation mirrors the broader JavaScript community’s evolution from “move fast and break things” to “build sustainably at scale.” The fact that companies are pouring resources into these tools isn’t just about reducing lap times—it’s recognition that bundlers have become critical infrastructure for the modern web. As these tools converge on similar performance characteristics, the podium won’t be determined by who posts the fastest qualifying time on synthetic benchmarks, but by who can deliver the most intelligent, maintainable, and genuinely useful developer experience while producing the leanest possible production artifacts. The bundler wars may be approaching their checkered flag, but the race for JavaScript sanity has just begun.
Header image: The Fairmont Hairpin at the Monaco Grand Prix.