Up to 35% of website performance bottlenecks can only be found by external testing, according to a US company that specialises in the work.
Mercury Interactive makes its claim after "load testing" thousands of sites, a three-hour service it provides at $34,000 a time. No New Zealand websites have yet used Mercury's services, says Sydney-based technical manager Peter Lilley.
According to Mercury’s tests, performance problems mostly arise outside firewalls and can be attributed to tuning and configuration issues with routers, gateways and switches, as well as bandwidth constraints and ISP peering point issues. The four main trouble spots with web data access are network bottlenecks, database tuning, application server configuration and web server configuration, the company claims.
The technology director of Wellington website designer Provoke Software, David ten Have, agrees that web servers and database tuning are common problems, but he discounts Mercury's assertion that network bottlenecks are common.
"In my experience I've always dealt with 'big pipes'," he says. "But if people are hosting a website behind a modem, network bottlenecks are definitely going to be an issue. So while it may have been a historic problem, it isn't any longer."
Ten Have says systems should be fine-tuned before going live.
"When you're transferring high volumes of dynamic data, often I've found that in your initial beta release you can spotlight a whole lot of problems with how you're using it. You can get significant gains in performance by just addressing how you've aligned your database, and how you're using it," ten Have says.
"That moves into the area of application servers, since the type of performance you get out of the technologies you're using depends on how quickly they process the data coming out of the database."
Mercury’s Lilley says organisations that undergo six access tests and carry out recommended system tuning can expect an increase in performance quality of 400%.