6 Key Points For Successful User Testing

How well do you know your web audience? You’ve probably got a fairly solid understanding of the types of people who visit your site and what they are looking to achieve, particularly if you are utilising Google Analytics. The value of Google Analytics is unquestionable. Personally, I believe it’s an absolute must for tracking traffic and focusing on how people are using your site. Google Analytics, however, can only provide so much information, and while these statistics are invaluable, to really get a clear idea of how your site is being used, proactive steps need to be taken.

One of the best ways to understand how your site is being used is through user testing. There is no better way to see how your site is performing and how people are actually using it than to study real users. User testing can be undertaken in a variety of ways, from simply watching friends use the site with a specific goal in mind to implementing a number of intricate tasks which are recorded and documented in detail. Below are some key points to ensure you get the most out of testing your site.

  1. Select relevant users

    User testing can fail before it’s even begun if the right people are not selected for the testing. This may sound a little alarming, but it merely highlights the importance of thinking about the types of people who use your site. Thought and consideration need to be applied when selecting subjects for user testing. The users being tested need to represent your web audience. Consider demographics, occupation, interests, age, gender, and anything else relevant to your site. Keep in mind, the results will only be as good as the people you test.

  2. It’s all about comfort

    Make sure the user is comfortable. So often, user testing is conducted in an artificial environment that the subject is unlikely to be accustomed to. No one likes sitting a test, so people will naturally feel edgy. To achieve the most authentic results, the user needs to navigate through the site as they normally would, as if they were at work or home, without expectations or pressure.

    Allow the user to become relaxed in the environment, explain what they’ll be doing and the website they’ll be testing, and discuss what the subject expects to find on a website of this nature. Remember, all information is useful, regardless of how trivial it may appear at the time.

  3. ‘Think out loud’

    After getting the user comfortable in their surroundings, chat to them about the process. Encourage the user to ‘think out loud’ while executing the outlined tasks. When a user hits a roadblock, it’s useful to hear what they are thinking and why they are making certain decisions. Before they start, be sure to inform the subject that it’s the system being tested, not the user, and that there are no correct responses. It’s human nature to provide answers that the facilitator will want to hear; no one wants to look stupid.

    Pay attention to body language, in addition to what the subject is saying and doing on-screen. Experienced facilitators are trained to decipher body language; for example, if a user states that they clearly understood the directions and then proceeds to look away from the facilitator, avoiding eye contact, it’s a good indication that the user was slightly confused or had some difficulty comprehending.

    It’s important the subject does not feel under pressure to say the right things. The purpose of testing the site is to uncover issues with its performance in the hope of ultimately improving it.

  4. Set tasks

    Set tasks which, ideally, have a start and end point. To accompany the qualitative information recorded through notes and video, it’s good to have some quantitative data. If we can see that 5 out of 8 people could not execute a simple task, it is likely the site’s structure is deficient and the information architecture ought to be revised.

    In user testing, a commonly used marking scheme for task success rates attributes each task attempt one of three marks: +1, 0, or -1.

    • +1 – for completing the task with no errors.
    • 0 – for completing the task with one or more errors.
    • -1 – for failing to complete the task.

    This marking scheme provides a guide as to how difficult or easy a specific task is.

    Analysing a completed results sheet in this way quickly shows where the problems lie: if, say, the majority of users fail to complete task 3 without errors, it suggests that either the task was flawed or the site itself requires amending.
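
    To make the marking concrete, here is a minimal sketch of how the +1/0/-1 marks from a session might be tallied per task. It assumes nothing about any particular recording tool, and the user names and scores are illustrative placeholders only, not real results.

        # Minimal sketch: tallying +1 / 0 / -1 task marks from a testing session.
        # scores[user] is a list of that user's marks, one per task (placeholder data).
        scores = {
            "user_1": [1, 1, 0],
            "user_2": [1, 0, -1],
            "user_3": [0, 1, -1],
            "user_4": [1, 1, -1],
            "user_5": [1, 0, 0],
        }

        num_tasks = len(next(iter(scores.values())))

        for task in range(num_tasks):
            marks = [user_marks[task] for user_marks in scores.values()]
            completed = sum(1 for m in marks if m >= 0)   # finished, with or without errors
            error_free = sum(1 for m in marks if m == 1)  # finished with no errors
            print(f"Task {task + 1}: {completed}/{len(marks)} completed, "
                  f"{error_free}/{len(marks)} error-free, "
                  f"average mark {sum(marks) / len(marks):+.2f}")

    A per-task average close to +1 suggests the task was easy; an average near or below zero flags a task, or the part of the site it exercises, as worth a closer look.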

    Tasks ought to be direct instructions, providing the user with a clear, tangible aim. For instance: ‘Please navigate to the contact us section, find the HR department’s number, and fill in the contact form with a message.’

    It’s important to give each subject one task at a time. This allows the user to focus on a single task without distraction, meaning you have a better chance of receiving quality information about that particular issue.

  5. Record: the Mor(a)e detail the better

    Detail is the key to user testing. If you have enough available funds, it’s great to be able to record the testing with video technology. This is not absolutely necessary, as you can still obtain great information by simply observing the subject; however, recording your session allows the facilitator to be more relaxed, without having to worry about capturing every minute detail, as there is always the opportunity to go back through the video material. There’s a whole range of fantastic user-testing recording programmes. My personal favourite is Morae, which we use at Wiliam and have had some great results with.

  6. Thanks

    Please bear in mind that it can be quite a daunting prospect to be tested on something, so be sure to thank your subjects. Not only are they taking time out of their day, but they’re doing you a favour. It’s common to provide the user with a small gift as a sign of appreciation.

Happy user testing!