
User testing can be intimidating for those who are new to the task. It’s a strange thing: while it requires a test facilitator to tap into their natural soft skills and intuition, it also requires them to fight some of their natural instincts. Based on decades of experience, here are our dos and don’ts for the strategies and disciplines of user testing.

Setting goals for user testing

Don’t: Conduct user testing for the sake of conducting user testing

Sometimes user testing can feel like an obligation. That’s not an entirely bad instinct to follow, but without a goal, the return on investment for this type of research is lousy.

✅ Do: Test with a specific goal in mind

Have a plan. Know specifically what you want to get out of the interviews. Goals typically fall into one of three categories: discovery research helps identify pain points and problems that need to be solved; concept validation confirms or gives critical feedback on an early-stage solution; usability testing identifies gaps in the design that need to be refined before launch. With a well-defined goal, you are far more likely to come away with tangible, applicable outcomes rather than generic sentiment.

Should you use focus groups for user testing?

Don’t: Spend time and money on focus groups

We’ve seen focus groups go off the rails pretty quickly. Focus groups seem efficient on paper but can yield much lower value than other forms of testing. They tend to be prone to groupthink, steered by a few dominant voices in the room. For example, we once had a partner share focus group recordings in which one participant started to worry about security and got the entire group so worked up that they kept coming back to security, ignoring all the other central topics on the agenda for the group to discuss.

✅ Do: Conduct one-on-one user interviews

You don’t have to do many one-on-one conversations to get a lot out of them. We typically shoot for five to eight interviews that last no more than 60 minutes. You’ll have more power to control the conversation, ask follow-up questions, and go deeper on the things you need to uncover.

How to recruit for user testing

Don’t: Recruit from your address book

It can be tough to recruit users you don’t know, but your colleagues, friends, and family are less than reliable sources of data for your research. They have a vested interest in making you happy so that they can have pleasant interactions in the lunchroom and at the Thanksgiving table. For this reason, they are more likely to produce false positives.

Don’t: Recruit the same testers over and over

There are some test participants who are engaged and insightful and would be lovely to go back to over and over again. If overtapped, however, this sample of test subjects will pigeonhole and bias the learnings. Their feedback gives researchers the illusion that they are getting to know the user audience when, in reality, they are just getting to know (and only designing for) a small segment of the user base.

Don’t: Recruit at bus and train stops

We’ve done it and we’re not proud of it. <Insert Silicon Valley meme here.> While it is a captive audience, it’s not the most reliable data source. Innocent bystanders turned user testers will jet as soon as they can and are more likely to give you false positives because they are not invested in the topic and don’t want to displease a stranger.

✅ Do: Recruit screened and qualified test participants

Screen your test subjects before booking interview sessions to verify that they have some level of vested interest in the topic at hand. Services like usertesting.com or userzoom.com make this easy. We typically receive a set of qualified matches back within a few days and can take our pick from that list. It is very rare for an unqualified test subject to slip through the screener purely to collect the fee, and most providers have refund policies in place for when that does happen.

Designing user test questions

Don’t: Test for general likability

It is an exercise in futility to simply throw an idea out into the wild just to see how people react. If the only question you are trying to answer is “did users like this?”, the learnings will be shallow and you enter the danger zone of a self-fulfilling prophecy. In this scenario, you’re more likely to get affirming feedback because you’re not being hard enough on your ideas or designs.

✅ Do: Test specific hypotheses

Being critical early in the product development process, while personally challenging, can pay off in the long run. User testing is an opportunity to check your riskiest assumptions. One way to do this is to develop three to five hypotheses: best guesses in the form of if/then statements, for example, “If we show pricing earlier in the flow, then fewer users will abandon the signup process.” Everything included in the scope of the test should relate to these hypotheses. This limits the test to the things that are most important to address and understand, so that you can make better-informed decisions.

Kicking off a user test

Don’t: Start the interview by showing off the prototype

Fight the urge to dive right into showing off the big idea. Testers typically need to be warmed up and there is important information to collect before the tester figures out your agenda.

✅ Do: Start the interview with an unbiased set of questions

Use the first few minutes to get to know your tester and to gauge their existing behaviors and opinions before they get wind of the testing agenda. Occasionally, testers will contradict their pre-prototype responses when reacting to the prototype; you can use those earlier responses to press for a more genuine answer. These questions can also help you identify the right persona or user segment for each tester, which can be illuminating during analysis.

How to conduct a user test

Don’t: Act as a personal tour guide for the prototype

It will require restraint to not jump in and talk up the prototype. The genuine excitement you feel for your creation will have to be subdued and should not be detected by the test subject. This would only further condition them to produce false positives.

✅ Do: Let the prototype speak for itself 

Let test subjects figure the prototype out on their own. Embrace the awkward silence and contemplative looks from the test subjects as they take it in. In the real world, it’s unlikely that a first-time product experience would come with a hype woman or man. A good rule of thumb is to be more interested than interesting.

Don’t: Answer participant questions

Test subjects can sometimes be unsure whether they fully understand how something might work, and their natural instinct is to ask the test facilitator to confirm. Don’t get into the habit of answering their questions, especially if they are asking about hypothetical features and functions. It becomes harder to steer the conversation back to the topics you need to explore when the test subject is the one asking all the questions.

✅ Do: Make participants answer their own questions

When applicable, flip the question back on the user: “What would you recommend?” “When might you be likely to use something like that?” “How would that affect your process?” These types of questions keep the test subject sharing their perspective, which makes this a worthwhile investment of resources.

This is a lot to keep in mind, but you don’t have to be perfect for every test. Work on improving gradually. Before planning your next round of testing, remind yourself of the dos. After the test, check for evidence of any don’ts. A good way to do this is to listen to yourself on the interview recordings. It’s an agonizing task, we know, but you’ll catch mistakes you didn’t realize you made and you’ll see how to improve your questioning technique. You don’t have to re-listen to every interview, but it’s a good skill-sharpening activity to do on occasion. Paying attention to the dos and don’ts over time will make these skills more intuitive and will lead to more successful and more valuable testing.

Honing user testing skills and best practices is one of the most powerful tools we have at ADK for helping partners create new or upgraded products. If you have any questions on how we can help make user testing drive value for your product, please let us know!