The real challenge of web design is to build websites with their end-users in mind. This means designing from the point of view of “user experience”. When someone visits a website for the first time, finds all the information they need without confusion, and can then complete their tasks without failing: that’s when we can say that the site has a “great user experience”.
This user experience, as was explained in the second of these four articles, is central to web design, which is driven by the principles of User-Centred Design (UCD). Your customers are your website users – and your website “works” only if they judge that it does. Experience tells us that users seldom give poor sites a second chance — so designing sites with customers’ needs as the number-one priority makes good sense.
In this article we will look at some of the methods that can be used to implement UCD in real websites. To do this we’ll consider some case studies drawn from the experience of Wired Internet Group, a Christchurch-based company striving to promote the principles of UCD in the local web industry. These case studies will reveal some of the “real world” usability issues that may be encountered, and the sometimes simple steps that you can take to remedy them, and give your site the user experience your customers deserve.
There are a number of different approaches to investigating online user experience. Four that are commonly employed by staff of Wired Internet Group are:
• User experience assessment;
• Enquiry analysis;
• Card sorting;
• Usability testing.
The first two approaches focus on using expert consultants to review and evaluate sites according to best practice principles, while the last two involve working with actual web users in test environments.
In this article we’ll look at how the first two evaluative approaches can add value to your design process.
User experience assessment
The user experience assessment (UEA) is a kind of expert review. These reviews (also known as heuristic evaluations) involve measuring sites against established rules of best practice. The UEA also adds some consultation with representative users to the review format.
The UEA is a four-stage process:
• Interview: a face-to-face, phone or email Q&A with the client to establish key site owner goals and concerns.
• Evaluation: an appraisal by a UCD consultant to measure the site against established user-design principles.
• Consultation: up to three consultant-observed user sessions to identify user concerns.
• Report: summary of findings with practical recommendations for improvement.
Wired staff conducted a UEA on the site of a business association. The findings from the evaluation and the user consultations were consistent, and largely matched the expectations expressed in the interview with the association’s web staff.
The main concerns identified were:
• While the site’s visual design was generally well liked, users were confused by its two navigation menus (one at the top of the page and one on the left side).
• The overall information architecture was neither clear nor intuitive. Users did not know where to look for specific information; even the most essential content was hard to find.
• Much of the site’s content was poorly written and did not communicate effectively.
• As a result of the first three findings, users relied heavily on the search function, but because headings and page content were poorly worded, this too failed to meet their needs.
Based on these findings, we were able to recommend that the navigation be unified under a new and more intuitive set of headings, and that every page be revised to start with a topic sentence stating clearly what the page contained. Changes of this kind also make site search more effective; indeed, they are often marketed under the heading “search engine optimisation”, but they generally work best as part of a comprehensive UCD approach to site design.
The site owners were relieved that they did not have to scrap the entire site, and they were able to implement the recommended changes quickly and cheaply. Calls to their staff asking for information that was already on the site dropped sharply, a clear saving in both time and money. Association members surveyed afterwards reported increased levels of service satisfaction.
Enquiry analysis
The enquiry analysis (EA) involves expert analysis of a sample of online customer email enquiries: the kind that customers send in using the “Contact us” page on your site. Getting a usable sample of enquiry emails is cheap and easy, and good results can be obtained from just a couple of hundred.
This process of analysis identifies the broad types of enquiry received, and thus the concerns that your online customers find important enough to email in a question or complaint. One of the most consistently revealing findings in an EA answers the question: “which information and services are available on the website, but can’t be found by users?”
The EA involves the following steps:
• Collection – the client’s web or customer service manager compiles a file of the sample enquiries; a simple spreadsheet will do.
• Analysis – Wired staff break the sample down by categories and identify trends or concerns.
• Report – A written summary of findings, with conclusions and action-oriented recommendations.
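The analysis step above can be sketched in code. The sketch below is illustrative only: the categories, keywords, and sample enquiries are all hypothetical, and a real EA would refine its categories by actually reading the enquiries rather than relying on keyword matching alone.

```python
from collections import Counter

# Hypothetical enquiry categories and the keywords that signal them.
# A real analysis would refine these by reading the enquiries themselves.
CATEGORIES = {
    "cant_find_info": ["where", "can't find", "cannot find", "looking for"],
    "complaint": ["complaint", "unhappy", "disappointed"],
    "employment": ["cv", "vacancy", "position"],
}

def categorise(text):
    """Return the first category whose keywords appear in the enquiry text."""
    lowered = text.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in lowered for keyword in keywords):
            return category
    return "other"

def analyse(enquiries):
    """Tally categories over an iterable of enquiry texts."""
    return Counter(categorise(text) for text in enquiries)

# A tiny in-memory sample standing in for the client's spreadsheet.
sample = [
    "Where do I find your opening hours?",
    "I'm looking for the membership form.",
    "Please consider my CV for any suitable role.",
]
print(analyse(sample))
```

Even a rough breakdown like this surfaces the key trend an EA looks for: how many enquiries ask for information that is already on the site but cannot be found.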
We undertook an EA for a major consumer brand because of the large number of enquiries the site was receiving. Even though the corporate site had been in operation for a couple of years, there had been no drop-off in customer enquiries to either the call centre or the webmaster. This was a sign that the website was confusing its users and not giving them the information they were looking for. We used a month’s worth of email enquiries, which in this instance was a meaningful sample size.
The pattern of subjects enquired about showed that some parts of the website were not answering users’ frequently asked questions. In 60 percent of enquiries, the information the customers needed was there, but they simply could not find it.
The enquiry form itself was badly designed. It was failing to categorise enquiries and thus making unnecessary sorting work for staff deciding who could answer the queries.
As a result of both the number of repeated enquiries, and those being misdirected to the wrong division, there were unacceptable response delays and customer dissatisfaction.
Based on this, we recommended that specific pages be renamed and their content rewritten. To enable useful categorisation of enquiries, the “Contact us” page needed to be redesigned so that all the enquiry categories were simultaneously visible on the page (previously, most enquirers selected the first option from a drop-down menu, because that was the only one visible).
We also recommended design of a special form for those seeking employment to upload their CVs direct to the HR department, thus eliminating the double-handling of nearly 20 percent of all the current enquiries by contact centre staff.
Finally, to establish site credibility, we urged the company to place a statement of expected response times to customer enquiries on the “Contact us” page – and to ensure resources were available to meet the promise made. Where no such promise is made, customers are likely to conclude that the company simply doesn’t bother answering enquiries.
Once implemented, these recommendations led to a substantial reduction in the number of enquiries, an increase in surveyed satisfaction levels, and a measurable reduction in the time required to resolve customer enquiries.
The simple truth was that site users could not access information the site owners thought they had made available, and company staff were being made to do pointless work that prevented them from devoting all their time to building the company’s brands in the marketplace.
Russell lectures at Christchurch Polytechnic, where he is the programme leader in the Graduate Diploma of Information Design. He consults with Wired Internet Group. Contact him at firstname.lastname@example.org