Faster Websites With Fewer Bugs


A fast website is an essential factor in its success. Loading a web page generally involves database queries, and those queries are time-consuming. Many websites therefore cache, or store, query results so that pages load faster. The complex task of analyzing a website’s code to identify which operations necessitate updates to which cached values generally falls to the web programmer.
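For readers unfamiliar with the pattern, the sketch below shows the usual “cache-aside” approach in Python: check an in-memory cache before running the database query, and save the result on a miss. The cache, table, and function names are illustrative assumptions, not part of the MIT system.

```python
# Minimal sketch of the cache-aside pattern for database queries.
# The cache, database handle, and query are illustrative, not from the MIT system.
cache = {}

def get_article(db, article_id):
    key = ("article", article_id)
    if key in cache:                      # cache hit: skip the slow database query
        return cache[key]
    row = db.execute(
        "SELECT title, body FROM articles WHERE id = ?", (article_id,)
    ).fetchone()                          # cache miss: run the query once
    cache[key] = row                      # save the result for later requests
    return row
```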

Scientists from MIT’s Computer Science and Artificial Intelligence Laboratory have designed a new system that makes websites and web applications faster by automatically handling the caching of database queries.

A website handles many requests simultaneously, possibly sending different users different cached data, and even data cached on different servers. Nevertheless, the system promises that every transaction will look exactly as it would if the requests had been handled in sequence.

The scientists tested the system on two websites built in Ur/Web. They found that the new system’s automatic caching offered speedups of twofold and 30-fold, respectively.

Associate Professor Adam Chlipala said, “Most very popular websites backed by databases don’t actually ask the database over and over again for each request. They notice that ‘Oh, I seem to have asked this question quite recently, and I saved the result, so I’ll just pull that out of memory.’

“But the tricky part here is that you have to realize when you make changes to the database that some of your saved answers are no longer necessarily correct, and you have to do what’s called ‘invalidating’ them. And in the mainstream way of implementing this, the programmer needs to manually add invalidation logic. For every line of code that changes the database, the programmer has to sit down and think, ‘Okay, for every other line of code that reads the database and saves the result in a cache, which ones of those are going to be broken by the change I just made?’”
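The snippet below illustrates, in Python, the manual invalidation burden Chlipala describes: every code path that writes to the database must also remember to drop any cached answers the write could make stale. The function names and cache keys are hypothetical, chosen to match the earlier sketch.

```python
# Sketch of the manual-invalidation approach the quote describes. Names are
# hypothetical; in a real application `cache` would be the shared cache used
# by the read path shown earlier (redefined here so the snippet stands alone).
cache = {}

def update_article(db, article_id, new_body):
    db.execute("UPDATE articles SET body = ? WHERE id = ?", (new_body, article_id))
    db.commit()
    # Invalidation logic the programmer has to add and keep in sync by hand:
    cache.pop(("article", article_id), None)   # the single-article query
    cache.pop(("article_list",), None)         # any list/summary query that includes this article
    # Forgetting one of these lines means users may be served stale data.
```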

The new system is a modification of the Ur/Web compiler. It allows Ur/Web users to simply recompile their existing code to get all the benefits of database caching.

The compiler analyzes the code to determine what data to cache and how to organize it, deciding whether to cache raw data, HTML code, or entire web pages. It then compares every operation that updates a value in the database with every operation that queries the database, works out which cached values each update makes stale, and adds the appropriate cache invalidation commands in the appropriate places.
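The toy Python sketch below conveys the flavor of that comparison step: for each update, find the cached queries that read from a table the update writes to, and mark them for invalidation. It is a drastic simplification under assumed names; the actual Ur/Web compiler analysis works on the program’s SQL and is far more precise.

```python
# Greatly simplified sketch of the comparison step described above: for each
# update statement, find the cached queries that read from the same table and
# mark them for invalidation. The table and query names are made up.
cached_queries = {
    "article_by_id": {"reads": {"articles"}},
    "comment_count": {"reads": {"comments"}},
    "front_page":    {"reads": {"articles", "comments"}},
}

updates = {
    "update_article": {"writes": {"articles"}},
    "add_comment":    {"writes": {"comments"}},
}

def invalidation_plan():
    plan = {}
    for upd, u in updates.items():
        stale = [q for q, c in cached_queries.items() if c["reads"] & u["writes"]]
        plan[upd] = stale        # caches to flush whenever this update runs
    return plan

print(invalidation_plan())
# {'update_article': ['article_by_id', 'front_page'], 'add_comment': ['comment_count', 'front_page']}
```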

The scientists also tested the system on a handful of smaller programs, where it offered speedups of between twofold and fivefold.

Chlipala said, “Even if it turns out that someone could put in the extra work and get a tripling of the throughput, our argument is that it’s a pretty good deal. You get a doubling of your throughput with no extra work.”
