Why Do Websites Keep Getting More Bloated Every Year?

Although my Internet connection is literally thousands of times faster than the dial-up line I used decades ago, websites often feel like they take just as long to load.
The obvious reason is that as bandwidth has grown, so has the volume of data that makes up a typical website. Even a relatively simple, text-heavy site like Wikipedia weighs in at a surprising number of megabytes, for no obvious reason at first glance. So what's going on?
From light to puffy
When all you had (in a perfect world) was 56 Kbps of bandwidth, your website had to be simple and efficient. The first sites were kilobytes in size and consisted mainly of text. When images appeared at all, they were low-resolution JPEG files with aggressive compression. I once downloaded a 64 MB music video over my dial-up modem, and it took an entire weekend to finish. Video embedded in a site? Forget it.
It’s the same old paradox that plays out on real-world highways: no matter how many roads, bypasses and overpasses you build, and how many lanes you add, you still end up in traffic jams. The same psychology and economics are probably behind the strange rebound effect of energy-saving lamps: instead of reducing our electricity bills, we simply add more light for the same energy cost!
The number one culprit is simply richer media. We have high-resolution screens on all of our devices, so we need high-resolution images if we want a site to look good. Modern image formats like WebP have helped reduce file sizes while maintaining quality, but the general trend is towards more images with more pixels, and that bloats things.
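To put the pixel growth in numbers, here is a back-of-the-envelope sketch in Python. The two resolutions are illustrative choices, not figures from the article, and the calculation covers only raw, uncompressed pixel data:

```python
def raw_image_bytes(width: int, height: int, bytes_per_pixel: int = 3) -> int:
    """Uncompressed size of an RGB image: one byte per channel per pixel."""
    return width * height * bytes_per_pixel

# An old-web 640x480 image vs. a modern 4K hero image, before any compression.
small = raw_image_bytes(640, 480)      # 921,600 bytes (~0.9 MB)
large = raw_image_bytes(3840, 2160)    # 24,883,200 bytes (~23.7 MB)

print(f"A 4K image holds {large // small}x more pixel data than 640x480")
```

Compression narrows the gap, of course, but a format that shrinks both images by the same ratio still leaves the 4K version 27 times heavier, which is why pixel counts matter even in the WebP era.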
If it were just still images, that would be one thing, but a typical website now serves auto-playing videos, audio files, animated GIFs, and plenty of rich animated content built into the page itself with technologies like HTML5.
Frameworks, Libraries and Code Overload
A modern website is not something a person sits down and codes in plain HTML like in the old days. Just like application developers, web developers rely on extensive libraries and sophisticated frameworks to build sites quickly and efficiently. The downside is that sites end up shipping far more code than they need, because developers reach for these libraries and frameworks even for relatively simple things.
Over time, sites also simply accumulate code as they change and update. Old code, unused CSS, outdated plugins and leftover debugging scripts pile up. None of it affects the site's functionality, but all of it consumes your bandwidth.
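As a toy illustration of how such dead weight can be detected, here is a deliberately naive Python sketch that compares the class names a stylesheet defines against the ones a page actually uses. Real tools like PurgeCSS do this far more carefully; the regexes below assume double-quoted `class` attributes and simple `.class` selectors, which is a simplification:

```python
import re

def unused_css_classes(html: str, css: str) -> set[str]:
    """Return class names defined in the CSS but never used in the HTML.

    Naive on purpose: assumes class="..." attributes and plain .class selectors.
    """
    attrs = re.findall(r'class="([^"]*)"', html)
    used = {name for attr in attrs for name in attr.split()}
    defined = set(re.findall(r'\.([A-Za-z_][\w-]*)', css))
    return defined - used

html = '<div class="hero nav"><p class="nav">Hi</p></div>'
css = '.hero { color: red; } .nav { margin: 0; } .old-banner { display: none; }'
print(unused_css_classes(html, css))  # {'old-banner'}
```

Even this crude check finds the leftover `.old-banner` rule; on a site that has accumulated years of redesigns, the list of orphaned selectors can be very long.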
Advertising, tracking and analytics
There is some justification for overhead that makes the user experience richer, but what about overhead that just makes someone richer? The truth, of course, is that websites cost money to build and operate, and they have to earn it back. Unless users pay directly for these services, the main way to sustain a website is advertising.
In the early days of the web, a site might have a single banner ad at the top and another at the bottom of the page. Endless scrolling didn't exist yet (which, incidentally, is another source of bloat today!).
There are now entire advertising systems built into websites, and they exist largely because people, in general, don't seem willing to pay for content even when they find it valuable. Necessary evil or not, there is no denying that ads, trackers, and analytics weigh websites down.
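To make that overhead concrete, here is a small Python sketch that tallies what share of a page's transferred bytes comes from third-party resources. The file names and byte counts are invented for illustration, not measurements of any real site:

```python
def third_party_share(resources: list[tuple[str, int, bool]]) -> float:
    """Fraction of total transferred bytes attributable to third parties.

    Each resource is (name, size_in_bytes, is_third_party).
    """
    total = sum(size for _, size, _ in resources)
    third = sum(size for _, size, is_tp in resources if is_tp)
    return third / total if total else 0.0

# A hypothetical page: first-party content plus ads and tracking.
page = [
    ("index.html",      30_000, False),
    ("app.js",         400_000, False),
    ("analytics.js",   120_000, True),
    ("ad-network.js",  250_000, True),
    ("tracker.gif",      1_000, True),
]
print(f"{third_party_share(page):.0%} of this page is ads and tracking")
```

With these made-up numbers, nearly half the download is bytes the reader never asked for; browser devtools' network panel lets you run the same accounting on real pages.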
The problem of feature creep
The last big reason for all this bloat, at least in my opinion, is the extreme feature creep that has taken over websites. A site used to be a static page of information; now every site seems to be trying to be every type of web application for every user.
Chat widgets pop up when you don't want them, software running in the background monitors everything you do, and notifications pile on top of notifications. When a website bolts on TikTok- or YouTube-style features that don't catch on, and nobody cleans them up properly later, you end up with a graveyard of failed "improvements."
Why it matters
We like to complain when our software isn't optimized, because it means spending money on faster processors and more memory just to stay in the same place. So why not raise the same complaint against websites, which are now software applications delivered live over the web? This isn't just a problem of confusing, ugly pages or long loading times.
Overloaded websites burn through data on limited connections, clutter the Internet for everyone, use more power, and drive up hardware costs. The web keeps growing heavier not because it has to, but because it can: faster pipes have made developers complacent, letting complexity grow unchecked. Improving things will take treating performance as a core design value again; wouldn't you welcome a cleaner web?



