14 Apr 2014

How to conduct user testing

By Vicky

A guest blog by Alice Tyler from Open Utility.

So you’ve got this idea and you’re positive that people want it and will use it. But how can you know for sure? That’s where user testing comes in. Knowing that you’re building the right product or service for the right market is critical to a successful start-up. This is known as product-market fit.

Open Utility is a BGV-incubated start-up that has spent the last year designing and testing its ideas. We’re now very close to reaching product-market fit, and the way we got here was through copious amounts of user testing. Without this process we would likely have spent time building a product that people didn’t understand and weren’t interested in using. Keep reading if you want to know how to conduct user testing – it’s something we feel is a critical tool in any start-up’s skill set.

What is user testing?

Simply put, it’s asking the potential end user of your product or service what they think. The way I see it, there are two kinds of test:

  • Testing the idea – will people use it? (this is very hard to test)
  • Testing the execution – can people use it? (this is much easier)

Here are my top 5 points on how to do user testing:

 1) Test from day one

Get into the practice of testing early: it will reduce waste because you’ll learn what works for your users before you build it, ensuring you don’t build pointless functionality. The basics of user testing are easy and cheap:

  • Invite people who would be your intended users to the office or for a Skype / phone chat.
  • Plan to use no more than 1 hour of their time and check they are happy to be recorded.
  • Have a script prepared that takes the user through the flow you want to test.
  • As they complete the tasks you ask of them, get them to say out loud what they are thinking or looking for.
  • Make your own observations on their actions, e.g. did they find that important button?
  • Thank them for their time and tell them it’s been really useful (even if it wasn’t).

 2) Test with ugly things

You might be nervous about showing users your unfinished design work, but don’t be – it’s important to test things before you build them, as well as after. The problem with pretty things is that people are easily distracted by colour: they will spend hours telling you how they didn’t like the blue you used instead of the important stuff, like whether they found it useful. ‘Ugly’ means hand-drawn sketches on paper (called a paper prototype) or digitally drawn wireframes with minimal visual styling. Once you think you’re there, you can test with either full mocked-up visuals or go straight into the digital product.

 3) Test regularly

We aim to test every other week at Open Utility. It takes time – roughly a full day per test once you add in the admin time to recruit the users, write the test, run it, and analyse and record the results. This upfront cost is certainly worth it, as it will save you time in the long run. Make sure each test checks that any work done in response to previous feedback is still on the right lines, and also covers any new functionality you’re intending to build.

Another thing to add here is that not every test will be useful, and that’s normal. I’d say about 1 in 5 tests is a dud – by which I mean you didn’t learn anything from that person.

 4) Have the whole team involved in testing

By this I mean it’s not just the designers who should be interested and active in testing – the CEO, the CTO, sales, marketing, community management, basically everyone, should be involved. The way we handle this at Open Utility is that I’m in a separate space with the user we’re testing with, completing the test according to the script. With the user’s permission, the rest of the team listens in via a Google Hangout, all on mute. This means the team can use the chat to ask me additional questions they think of during the test and support me with answers to questions I didn’t know the answer to! This has worked well for the many remote tests we’ve run this way so far. For face-to-face tests, another member of the team joins us and remains an observer until set breakpoints, where any additional questions can be asked.

 5) Ignore 50% of what they say

It takes practice to know which bits of what users say to ignore and which bits to use. As a general rule of thumb, ignore most of what they say and concentrate on what they do. Do they stumble at a certain place in the flow? Did they see the key piece of information? These are things you’ll see but they are unlikely to tell you, which is why observational notes on their screen actions are so important. Once you’ve gathered three tests together, if there’s an obvious pattern in something they all said, you can treat it as a useful piece of information.

If you’re thinking of running some user tests for the first time, I highly recommend the book ‘Rocket Surgery Made Easy’ by Steve Krug. It explains exactly how to run a user test from start to finish (here’s a demo)… Good luck!