A quest for the right remote usability testing tool

Tags: #UxResearch #UxrTools #UsabilityTesting

A few days before any of us knew about the new work-from-home normal, I began researching a new usability testing tool for Harvard Web Publishing (HWP). We had just taken on a project that involved testing the usability of a new calendar for undergraduate students, and we were considering running a series of unmoderated usability tests, perhaps using a tool like Loop 11. We'd been leaning toward Loop 11 because it cost less than most comparable tools and was well known for being accessible.

Two weeks or so before the project kicked off, everyone was asked to work from home until further notice. My task suddenly became even more crucial: I was no longer evaluating one tool for one project; I was helping determine whether we had the right tools for remote usability testing for the weeks or months to come.

To meet this goal, I piloted several tools and spoke with colleagues from different departments at Harvard, as well as at MIT. Here's what I learned.

[Image: laptop on a desk]

Lessons from researching usability testing tools

UXR needs and practices vary widely between organizations, not just fields

One of the steps I took to evaluate testing tools was to speak with colleagues at MIT who have a similar for-hire model. For context, HWP is hired by different departments within Harvard to carry out web and UX projects.

While chatting with our counterpart at MIT, I was surprised to learn how different our business models were. Many of the applications they worked on had external audiences, whereas HWP mostly did research within the Harvard community, with the exception of a few projects. And when we did work with external audiences, about half of those audiences were highly specialized. For example, we're currently working with NASA and the Smithsonian on testing an app for astrophysicists.

All of this is to say that our needs differed vastly from MIT's. One of the main reasons they used tools like Loop 11 and WhatUsersDo was that these tools took care of participant recruiting. HWP, however, had little use for built-in recruiting, since most of our projects drew participants from our immediate community.

Remember to test across devices

I quickly realized that it was not enough to try out different tools on a laptop; I also needed to test them on mobile.

While piloting Loop 11, I discovered that it was not a great tool for remote moderated mobile usability testing: there was no way for me to see the participant's screen while the test was occurring. This was highly disappointing, because Loop 11 otherwise provided fantastic data, including time on task, detailed click paths, and some ways to analyze the latter.

Another tool I was glad to have tested on mobile was Zoom. When we decided to try Zoom for mobile testing, we discovered that its screen-sharing feature on mobile was not as straightforward as on desktop. Piloting the share feature with colleagues helped us determine the best way to orient research participants.

Understanding the limitations of your results

One key thing to remember is that unmoderated remote usability testing leaves less room for follow-up. Unmoderated testing tends to be the better option when you need quantitative data, or as a way to identify key areas for further research. Some unmoderated usability testing tools try to compensate by letting you add open-ended questions. If you want richer qualitative data, however, I recommend moderated testing.

Understanding how your audience may be skewed

Unfortunately, remote testing can mean a steeper learning curve for participants. One of the benefits of the participant recruiting pools offered by tools like Loop 11, UserTesting, and WhatUsersDo is that participants in those pools already have the hardware, software, and tool knowledge needed to participate smoothly in remote testing. Before choosing one of these tools, consider your own audience's comfort with, and access to, the required technology.

[Image: screenshot of a usability test conducted through Zoom]

What we did

In the end, I did not recommend that my team adopt Loop 11, WhatUsersDo, or anything new. After speaking further with the client who had hired us to research the new undergraduate student calendar, we found that unmoderated testing was not the right approach for the project. And, in general, we did not have much demand for unmoderated usability testing.

Based on our project history, most of our unmoderated research consisted of card sorts and tree tests. For those types of tests, as well as for the less common first-click tests, we could rely on the tools we already had in our Optimal Workshop suite.

For usability testing specifically, most of our clients needed the kind of qualitative insight that only moderated testing provides. And because Harvard uses Zoom for remote classes and work, our audience largely already knew how to use it, making Zoom the better option for remote moderated testing.

Of course, this does not mean we will always use Zoom and Optimal Workshop, nor does it mean that everyone should use these tools. As our projects and needs evolve, we will continue to evaluate our tools, and you should do the same!