MIT algorithm for page speed


From Slashdot today:

MIT researchers have created an algorithm that analyzes web pages and creates dependency graphs for all network resources that need to be loaded (CSS, JS, images, etc.). The algorithm, called Polaris, will be presented this week at the USENIX Symposium on Networked Systems Design and Implementation conference, and is said to be able to cut down page load times by 34%, on average. The larger and more resources a web page contains, the better the algorithm’s efficiency gets – which should be useful on today’s JavaScript-heavy sites.
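The approach is easy enough to sketch. Here's a toy version in TypeScript: model each resource as a node, record what it depends on, and use a topological sort to derive a fetch order that respects those dependencies. The file names and edges below are made up for illustration, and the real Polaris does far more than this, but it shows the shape of the idea.

```typescript
// Toy sketch of the idea behind Polaris: model page resources as a
// dependency graph, then fetch them in an order that respects the
// dependencies. The resources and edges here are hypothetical examples,
// not data or code from the MIT paper.

type Resource = string;

// Each resource maps to the resources that must be loaded before it.
const dependsOn: Map<Resource, Resource[]> = new Map([
  ["index.html", []],
  ["app.css", ["index.html"]],
  ["vendor.js", ["index.html"]],
  ["app.js", ["vendor.js"]],
  ["hero.jpg", ["app.css"]],
]);

// Topological sort (Kahn's algorithm) yields a valid load order.
function loadOrder(graph: Map<Resource, Resource[]>): Resource[] {
  const inDegree = new Map<Resource, number>();
  const dependents = new Map<Resource, Resource[]>();

  for (const [res, deps] of graph) {
    inDegree.set(res, deps.length);
    for (const dep of deps) {
      if (!dependents.has(dep)) dependents.set(dep, []);
      dependents.get(dep)!.push(res);
    }
  }

  // Start with resources that depend on nothing.
  const queue = [...inDegree].filter(([, d]) => d === 0).map(([r]) => r);
  const order: Resource[] = [];

  while (queue.length > 0) {
    const res = queue.shift()!;
    order.push(res);
    for (const dep of dependents.get(res) ?? []) {
      const remaining = inDegree.get(dep)! - 1;
      inDegree.set(dep, remaining);
      if (remaining === 0) queue.push(dep);
    }
  }

  if (order.length !== graph.size) {
    throw new Error("dependency cycle detected");
  }
  return order;
}

console.log(loadOrder(dependsOn));
// => [ "index.html", "app.css", "vendor.js", "hero.jpg", "app.js" ]
```

Resources whose dependencies have all been met could be fetched in parallel, which is presumably where the savings come from on pages with deep dependency chains.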

As one would expect from the Slashdot crowd, the highest-voted comment was that we already had this, and it’s called AdBlock Plus. The second one called out NoScript. And they’re right.

An algorithm that reduces page load times would have far-reaching, positive effects. But it attacks the symptoms of bloated pages, not the cause.

I made the mistake of using a browser without NoScript recently, and was stunned at how sluggish every site I went to was. Fonts suddenly changing once they're downloaded. Content injected from trillions of places. Bloated JavaScript libraries and trackers. Auto-playing movies of no value. Maybe most people are desensitised to it, but I found the experience absolutely horrible.
