How Much JS a Modern Website Actually Needs to Work

Websites are getting fat. It’s a problem everyone feels but nobody seems to know how to fix without breaking their entire tech stack. If you open the network tab in Chrome on a random news site, you'll probably see several megabytes of JavaScript flying across the wire. It’s wild. Most of that code isn't even doing anything for you, the user. It’s just... there.

How much JS a site really needs depends on what you're trying to build, but the industry has drifted far away from what is actually efficient. We’ve traded user experience for developer convenience. It’s easy to npm install a massive library to handle a simple dropdown menu. But your users pay the "JS tax" in battery life and load times.

The Brutal Reality of Browser Parsing

Most people assume a 500KB image and a 500KB JavaScript file cost the browser the same. They don't. Not even close. When a browser downloads an image, it just has to decode it and splash it on the screen. When it hits a massive JS bundle, the main thread locks up. It has to parse the code. It has to compile it. Then it finally executes it.

During that time? Your site is a ghost. Users click buttons and nothing happens. It’s frustrating.
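
You can actually watch this happen with the browser's Long Tasks API, which reports any main-thread task that runs longer than 50ms. A minimal sketch (the logging is just for illustration):

```typescript
// Log every main-thread task longer than 50ms (the Long Tasks API threshold).
// Each of these is a window where clicks, scrolls, and typing go unanswered.
const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.warn(
      `Main thread blocked for ${Math.round(entry.duration)}ms,`,
      `starting ${Math.round(entry.startTime)}ms after navigation`
    );
  }
});

// 'buffered: true' also reports long tasks that happened before this script ran.
observer.observe({ type: 'longtask', buffered: true });
```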

Alex Russell, a partner product manager at Microsoft and a loud voice in the performance community, has been sounding the alarm on this for years. He argues that for a site to be fast on a median mobile device—think a budget Android phone on a spotty 4G connection—you only have about 130KB to 170KB of "budget" for your entire initial load. That’s tiny. Most modern frameworks like Next.js or Nuxt.js can eat up half of that just by existing.
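
If you want to hold yourself to a number like that, one option is to make the build fail when you blow past it. Here's a rough sketch for a webpack setup (assuming you use webpack at all; note that webpack measures the uncompressed output, so this is a stricter proxy rather than literal bytes over the wire):

```typescript
// webpack.config.ts: turn the performance budget into a hard build error.
import type { Configuration } from 'webpack';

const config: Configuration = {
  performance: {
    hints: 'error',                 // fail the build instead of just warning
    maxEntrypointSize: 170 * 1024,  // ~170KB, the top of the budget discussed above
    maxAssetSize: 170 * 1024,       // no single emitted asset may exceed this either
  },
};

export default config;
```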

Why 1MB of JavaScript is the New Normal (and Why It Sucks)

If you look at the HTTP Archive, the median mobile page now sends over 400KB of JS. That’s the median. Plenty of "enterprise" sites are rocking 1MB to 2MB. Why?

It’s the "Dependency Hell" effect.

You want a nice calendar picker? That’s a library. You want some slick animations? That’s another one. Throw in Google Analytics, the Facebook Pixel, HubSpot tracking, and a chatbot, and suddenly your "simple" site is heavier than a 1990s desktop application. Honestly, third-party scripts are often the biggest villains here. You can optimize your own code until you're blue in the face, but if marketing insists on five different tracking pixels, your performance is toast.
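
One partial defence, if you can't get rid of those scripts, is to stop loading them up front. A sketch of deferring a third-party tag until the browser is idle (the URL is a placeholder, and loading late means the tag may miss early events, so treat it as a trade-off rather than a free win):

```typescript
// Inject a third-party script once the main thread has gone idle,
// or after a hard timeout, whichever comes first.
function loadThirdParty(src: string): void {
  const inject = () => {
    const script = document.createElement('script');
    script.src = src;
    script.async = true;
    document.head.appendChild(script);
  };

  if ('requestIdleCallback' in window) {
    requestIdleCallback(inject, { timeout: 5000 });
  } else {
    setTimeout(inject, 3000); // fallback where requestIdleCallback isn't available
  }
}

loadThirdParty('https://example.com/analytics.js'); // placeholder URL
```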

The "Framework" Tax

We love React. We love Vue. They make building complex UIs much easier. But they come with a baseline cost. Even a "Hello World" in React pulls in react plus react-dom, roughly 40KB of minified, gzipped code before you've written a single component. Then you add a state management library like Redux or MobX on top.

Before you’ve even written a single line of your own logic, you’re already halfway to your performance budget.
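
If you're committed to React, the usual way to claw some of that back is code-splitting: keep the first bundle small and pull heavier pieces in on demand. A minimal sketch using React's built-in `lazy` and `Suspense` (the `HeavyChart` module is hypothetical):

```tsx
import { lazy, Suspense } from 'react';

// The chart code lives in its own chunk and only downloads
// when this component actually renders.
const HeavyChart = lazy(() => import('./HeavyChart')); // hypothetical module

export function Dashboard() {
  return (
    <Suspense fallback={<p>Loading chart…</p>}>
      <HeavyChart />
    </Suspense>
  );
}
```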

There’s a shift happening, though. People are starting to realize that maybe we don't need a massive SPA (Single Page Application) for a blog or a marketing site. That's where things like Astro or SvelteKit (with pre-rendering) come in. They try to ship zero—or very little—JS to the browser unless it's absolutely necessary. It’s a "use only what you need" philosophy.
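
You can borrow the same idea without switching frameworks: don't download a widget's JavaScript until the element it powers is actually on screen. A rough sketch in plain TypeScript (the `#comments` selector and the `comments-widget` module are made up):

```typescript
// Load and mount a widget only once its container scrolls into view.
const target = document.querySelector('#comments'); // hypothetical container

if (target) {
  const io = new IntersectionObserver(async (entries, observer) => {
    if (entries.some((entry) => entry.isIntersecting)) {
      observer.disconnect(); // we only need to do this once
      const { mountComments } = await import('./comments-widget'); // hypothetical module
      mountComments(target);
    }
  });
  io.observe(target);
}
```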

Real World Examples: High Stakes Performance

Look at Amazon. They famously found that every 100ms of latency cost them 1% in sales. For a company that size, that's billions. They don't use heavy frameworks on their search results pages for a reason. They use highly optimized, often vanilla JS, because every byte matters.

On the flip side, look at something like Google Docs. That is a massive, JS-heavy application. And it has to be. You can’t build a real-time collaborative word processor with just HTML and CSS. In that case, 2MB of JS is actually reasonable. The context matters more than the raw number.

The problem is when a local bakery website uses the same amount of JS as a complex productivity tool. That's just bad engineering.

How Much JS a Site Actually Needs: A Better Way to Measure

Instead of looking at the total file size, experts like those at WebPageTest suggest looking at "Total Blocking Time" (TBT) and "Interaction to Next Paint" (INP). These metrics tell you if the JS you are shipping is actually hurting the user.
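
TBT is a lab metric, the kind of thing Lighthouse or WebPageTest reports, but INP can be collected from real users. A small sketch assuming you've installed Google's open-source `web-vitals` package (the `/analytics` endpoint is a placeholder):

```typescript
import { onINP } from 'web-vitals';

// Report Interaction to Next Paint for real visitors; the metric is
// delivered when the page is backgrounded or unloaded.
onINP((metric) => {
  navigator.sendBeacon(
    '/analytics', // placeholder endpoint
    JSON.stringify({
      name: metric.name,               // 'INP'
      value: Math.round(metric.value), // interaction latency in ms
      rating: metric.rating,           // 'good' | 'needs-improvement' | 'poor'
    })
  );
});
```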

If you have 500KB of JS but it's all deferred and doesn't block the initial render, you might be okay. But if you have 100KB of JS that sits at the top of your `<head>` as a render-blocking script, your users will feel every millisecond of it.