Battling bottlenecks

Where do they get their figures? According to Zona Research in the US, the likelihood of a customer leaving a website is just 7% if the page appears in under seven seconds, 30% if it takes eight seconds or more to load and 70% if the clock ticks past 12 seconds.

But of course we’re talking cheap, ubiquitous, high-speed access and some really big sites. One example, DiscoverMusic.com, serves 60 million to 70 million audio streams a month; not the sort of volume your typical New Zealand site has reached. Here the bottleneck problem has a somewhat different, perhaps uniquely Kiwi, edge.

Still, Simon Greenwood, managing director of Auckland-based Greenwood Technology (GTL), says the perception that the process may be slow is sometimes a bottleneck in itself. It forms a barrier to the uptake and acceptance of e-commerce, especially in the B2B arena.

“People are used to LAN speeds. Unless you can afford faster telecommunications pipes there’s a barrier there in that smaller businesses may have to spend a bit more money to make e-commerce worthwhile.”

Fortunately the comparatively recent emergence of cost-effective high-speed access is starting to make those attitudes a thing of the past.

At Greenwood, a firm specialising in internet-architected applications (IAAs), bottleneck bashing starts when the code is written.

“The time taken to complete a client request can be broken down into numerous stages: sending the request to the server, the server handling it then forwarding it to the database, the database responding, then the server sending that response back down the pipe to the client. By using profiling tools at the development stage we can pick out the sluggish spots server-side and work on them before the site ever goes live.”
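The stage-by-stage breakdown Greenwood describes can be instrumented directly. A minimal sketch of that idea (the stage names, timings and helper below are illustrative assumptions, not GTL's actual profiling tools): time each leg of a request and flag the slowest so it can be worked on before the site goes live.

```python
import time

def profile_stages(stages):
    """Run each (name, fn) stage in order, recording elapsed wall time.

    Returns a list of (name, seconds) pairs and the name of the slowest
    stage: the "sluggish spot" to optimise first.
    """
    timings = []
    for name, fn in stages:
        start = time.perf_counter()
        fn()
        timings.append((name, time.perf_counter() - start))
    slowest = max(timings, key=lambda t: t[1])[0]
    return timings, slowest

# Illustrative stages; in a real application each would do actual work.
stages = [
    ("receive_request", lambda: time.sleep(0.01)),
    ("query_database",  lambda: time.sleep(0.05)),  # a typical sluggish spot
    ("send_response",   lambda: time.sleep(0.01)),
]
timings, slowest = profile_stages(stages)
print(slowest)
```

A real profiler hooks into the server rather than wrapping callables, but the principle is the same: measure per stage, then tune the worst offender.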

But that doesn’t preclude regular monitoring. “There are two parts to this process. Making sure the application’s been coded in an efficient way and then actively monitoring the server to see how the site’s actually handling the volumes.”

Obviously the frequency of that monitoring depends on the site itself. Some only require a look-see now and then; others, like DiarySmart, demand eternal vigilance and forward planning. After all, being a soaring success is one thing; being wiped out by too much volume quite another.

“If you’re delivering database content to the web,” Greenwood stresses, “you’ve got to have a robust database management system.” Here, server consolidation is important. As the DiarySmart experience shows, the ability to upscale one or more databases is then simply a matter of adding more processors and memory to the centralised environment.

GTL is a Unix shop. It has a cluster of Sun boxes running Solaris and the open source Apache web server along with Oracle. All applications are Java-based for maximum portability. “Unix running on some of the bigger iron gives us very stable e-commerce sites,” Greenwood says. This environment allows it to load balance its application servers, and the system works well.

“You get your redundancy through multiple servers serving up one or more applications each, and all load balanced.” After all, “You can’t afford to have the site go down.”
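The redundancy model Greenwood outlines can be sketched as a round-robin dispatcher that skips unhealthy servers, so the site stays up as long as any one server in the pool is alive. The server names and health-check below are hypothetical, not GTL's actual configuration.

```python
from itertools import cycle

class LoadBalancer:
    """Round-robin dispatcher over a pool of application servers.

    Servers marked down are skipped, giving redundancy: traffic keeps
    flowing while at least one server in the pool is healthy.
    """
    def __init__(self, servers):
        self.healthy = dict.fromkeys(servers, True)
        self._ring = cycle(servers)

    def mark_down(self, server):
        self.healthy[server] = False

    def next_server(self):
        # At most one full pass around the ring before giving up.
        for _ in range(len(self.healthy)):
            server = next(self._ring)
            if self.healthy[server]:
                return server
        raise RuntimeError("all servers down")

# Hypothetical pool of application servers.
lb = LoadBalancer(["app1", "app2", "app3"])
lb.mark_down("app2")
print([lb.next_server() for _ in range(4)])  # "app2" is never chosen
```

Production setups do this in the network layer or the web server rather than in application code, but the failover logic is the same.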
