When designing new web pages or working on new features for an existing page, developers commonly consider the following factors:
- Page appearance for different layout sizes
- Browser-specific implementations
- Page behavior with and without JavaScript enabled
All of the above points are valid, and developers should indeed think about them. However, there is another factor that developers might not think about as much: how their pages load on low-bandwidth connections or slower computers.
When users browse web sites, numerous factors contribute to the speed at which web pages load. For example, some users might have high-bandwidth connections but slower computers, while others might have high-speed computers with low-bandwidth connections. Due to such factors, users of your web site can have vastly different page loading experiences.
Most web sites collect real-time site speed data to see how their web pages are loading on their users’ computers. Typically, these numbers provide median, average, and 95th-percentile loading times. If you look closely at the 95th-percentile data, you might be amazed at how slowly your web pages are loading. Can we do anything about these 95th-percentile users?
To analyze your users’ bandwidth experience, collect and segregate data based on the load time of your web pages. Next, determine what percentage of your users are experiencing slow load times. For this determination, it’s up to you to draw the line between fast users and slow users.
Suppose that your segregated data looks like this:
| Time | Users (cumulative) |
|---------|------|
| <1 sec | 32% |
| <2 sec | 64% |
| <3 sec | 77% |
| <4 sec | 85% |
| <5 sec | 89% |
| <6 sec | 92% |
| <7 sec | 93% |
| <8 sec | 94% |
| <9 sec | 95% |
| 10+ sec | 100% |
According to the data, pages load in less than 4 seconds for 85% of your users. Let’s say you decide that anything over 4 seconds counts as slow loading.
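The bucketing behind a table like this can be sketched in a few lines. This is an illustrative helper, not part of any analytics product: given raw page-load times, it computes the cumulative share of users under each whole-second cutoff.

```javascript
// Sketch: cumulative share of users whose load time falls under each
// whole-second cutoff, as percentages (rounded to whole numbers).
function cumulativeShares(loadTimesMs, maxSeconds) {
  var shares = [];
  for (var s = 1; s <= maxSeconds; s++) {
    var under = loadTimesMs.filter(function (t) {
      return t < s * 1000; // users loading in under s seconds
    }).length;
    shares.push(Math.round((under / loadTimesMs.length) * 100));
  }
  return shares;
}
```

Feeding it your raw measurements yields the percentages for each row of the table above.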
Now we have a data point to distinguish between slow and fast load times. But how do we detect load times in real time, per user? One approach is to download an image of X bytes, measure the time taken to load the image, and use that time to calculate the user’s bandwidth. However, this approach might not capture all of the factors—such as JavaScript execution time—that affect page loading time.
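As a sketch, the image-probe approach might look like the following. The probe URL, its byte size, and the function names are assumptions for illustration, not part of the original:

```javascript
// Bandwidth probe sketch: time the download of an image of known size.
var PROBE_URL = '/probe.jpg'; // assumed: a small test image on your server
var PROBE_BYTES = 50000;      // assumed: its size in bytes

// Pure calculation: bytes transferred over elapsed time, in bits per second
function bandwidthBps(bytes, elapsedMs) {
  return (bytes * 8) / (elapsedMs / 1000);
}

function measureBandwidth(callback) {
  var img = new Image();
  var start = new Date().getTime();
  img.onload = function () {
    callback(bandwidthBps(PROBE_BYTES, new Date().getTime() - start));
  };
  img.src = PROBE_URL + '?t=' + start; // cache-buster forces a network fetch
}
```

Note that this measures only network transfer, which is exactly the limitation described above.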
Here is a different approach, which captures all of the factors that contribute to slowness. When the user visits a web page for the first time, calculate the page’s load time at the start of the session and store it in a cookie. Do this calculation only once, and use the stored value for the rest of the session.
Calculating page load time is simple. The following script measures purely client-side load time: the clock starts when the script in the head executes, not when the browser begins the request. (Note that it assumes you have JavaScript functions to read and write cookies.)
<HEAD>
<SCRIPT>
  var start = new Date().getTime(); // Take the start measurement as early as possible
</SCRIPT>
</HEAD>
<SCRIPT>
(function () {
  window.onload = function () { // Use an appropriate event-binding mechanism
    var end = new Date().getTime();                  // Take the end measurement
    var loadtime = end - start;                      // Load time in milliseconds
    var loadtimefromcookie = readCookie('loadtime'); // JS function to read the cookie
    if (!loadtimefromcookie) {                       // If load time not yet stored
      writeCookie('loadtime', loadtime);             // JS function to set the cookie
    }
  };
})();
</SCRIPT>
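The cookie helpers the script assumes are not specified; one possible sketch, with the lookup logic factored into a pure function and an explicit cookie name, is:

```javascript
// Sketch of cookie helpers (illustrative names, not the article's own code).
// parseCookie is pure so the lookup logic can be tested in isolation.
function parseCookie(name, cookieString) {
  // cookieString looks like "a=1; loadtime=2300"
  var match = cookieString.match(new RegExp('(?:^|;\\s*)' + name + '=([^;]*)'));
  return match ? decodeURIComponent(match[1]) : null;
}

function readCookie(name) {
  return parseCookie(name, document.cookie);
}

function writeCookie(name, value) {
  // No expires attribute, so this is a session cookie -- it lasts exactly as
  // long as the "calculate once per session" approach needs.
  document.cookie = name + '=' + encodeURIComponent(value) + '; path=/';
}
```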
From the second request onwards, you can read the cookie; based on its value, you can strip down the features for lower-bandwidth users and/or add more features for higher-bandwidth users.
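For example, the per-request decision might be as simple as the following. The 4-second threshold comes from the analysis above; the profile names are purely illustrative:

```javascript
// Sketch: pick a page profile from the load time stored in the cookie.
var SLOW_THRESHOLD_MS = 4000; // the slow/fast cutoff chosen in the analysis

function pageProfile(loadtimeMs) {
  // 'lite': fewer results, smaller images, fewer ads; 'full': everything
  return loadtimeMs > SLOW_THRESHOLD_MS ? 'lite' : 'full';
}
```

On the server side, you would read the same cookie from the request headers and render the chosen profile.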
How can you improve the loading speed for lower-bandwidth users? Here are some techniques:
- Show fewer results.
- Use smaller images.
- Cut down on heavy ads, or remove all ads.
- Remove heavy or flashy features that are not required for basic functionality.
Do A/B testing with these changes, assess the gains for the 95th-percentile user, and see whether this approach makes a difference. Once you see positive changes, you can target and fine-tune page features based on user bandwidth.