How we built a system to let our teams run their own user research projects

Carolyn Gearig
WhereByUs
6 min read · Nov 21, 2019

At WhereBy.Us, we run five local media brands in Pittsburgh, Seattle, Portland, Orlando and Miami, supported by small central teams including sales, growth, product, and local. Our product team spearheads user research and maintains a catalog of observations and insights, and our teams and cities all have unique research questions, like:

  • What triggers a reader to open one of our newsletters? How much do factors like subject line, send time, and day of the week contribute to this?
  • What motivates readers to become a member of one of our local brands?
  • What can the needs of content creators in different cities tell us about possible expansion opportunities?

Our product team is small, so we don’t have the resources to run all of these projects on our own. Our solution? We built documentation and ran user research training to empower teams to design and conduct their own projects with assistance on organization, methods, and sensemaking.

Building infrastructure that lets our teams do their own user research solves a logistical problem, but it has other benefits, too. We run local media brands, and no one is more familiar with our different users than the people on the ground. Our engaged readers often come to in-person events, so having our local directors run user interviews, for example, helps us build and strengthen relationships. When other teams such as sales have specific research questions, they're better equipped to dig into the specifics of customer jobs to be done, because they work with customers all the time. User research doesn't have to be handled within a single team: we think it's better for this work, and the insights from it, to be shared across the company.

Our five-step workflow for designing and running user research projects

In our documentation, we break up a user research project into five parts outlined in a workflow document. I’ll be linking to public copies of our documentation, which lives here — feel free to adapt for your own needs!

1. Initial research planning consists of two half-hour meetings, one for research scoping and one for logistics planning, which can usually be accomplished in a one-week sprint. Throughout this planning sprint, team members fill out a research plan document, with different questions to consider at each stage.

First, a team member should jot down initial research thoughts.

Then, relevant team members and a point person from the product team meet to determine what we want to research and the correct method for it. Next, a smaller meeting is held to answer logistical questions, like how we’re going to find interview participants or survey respondents, how we will compensate them, and what our research timeline looks like.

2. The next step consists of writing survey or interview questions, then scheduling and conducting research. This is the bulk of the project and will probably take two or more sprints of work, depending on the number of interviews being run — we estimate that the tasks for each interview take around an hour.

3. Compiling your results consists of noting initial survey and interview takeaways and cataloging interview audio recordings and survey data.

4. For each project, we run a one-hour sensemaking meeting where we review who we interviewed or surveyed against our research question, discuss observations and insights from research, and ensure we’ll be acting on the right insights.

5. We end each project with a one-hour cataloging and retro workshop with the team members who participated in the research and at least one point person from product. We catalog results in our user research database, then run a traditional retro, where team members share and discuss what went well, what needs a little more work, and what went poorly.

At each step of the workflow document, we link to relevant documentation, such as an overview for scheduling and conducting research.

All of this is organized in a folder of our team Google Drive.

Survey and interview documentation

Most of our team members have never run surveys or interviews, so we put together documents and training that explain interview and survey logistics and basic principles: the benefits and drawbacks of each research method, how to write effective survey questions, how to know you're getting meaningful results, and logistics around recruitment, research scheduling, participant compensation, and more.

We also created a document called “Should I use a survey or interview?” that provides quick guidance for choosing between two of our most common research methods, keeping in mind factors like project timeline, whether the questions are ongoing, and the desired number of participants.
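For illustration, here's a rough sketch of what that kind of guidance might look like if you encoded it as a tiny decision helper. The thresholds and wording below are hypothetical, not anything from our actual document:

```python
# Illustrative sketch of survey-vs-interview guidance.
# The thresholds here are made up for demonstration purposes.

def suggest_method(timeline_weeks: int, ongoing: bool, participants: int) -> str:
    """Suggest a research method from a few rough heuristics."""
    if ongoing or participants > 30:
        # Ongoing questions and large samples favor surveys,
        # which scale cheaply once written.
        return "survey"
    if timeline_weeks <= 2 and participants <= 10:
        # Tight timelines with few participants favor interviews,
        # which surface richer "why" answers per person.
        return "interview"
    return "either; discuss in the scoping meeting"

print(suggest_method(timeline_weeks=1, ongoing=False, participants=8))
# interview
```

The point isn't to automate the choice, but that writing the trade-offs down as explicit rules forces the team to agree on what actually drives the decision.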

How I onboarded team members

I ran a videoconference to train and onboard our team members and to give them time to think of possible research projects during our fourth quarter planning process. I addressed everyone’s questions and updated our docs accordingly. Here’s our onboarding presentation, which we’ll share with new teammates too.

Onboarding also happens on a case-by-case basis; some projects are simple enough to run on these docs, while others may need training on specifics like a specialized type of interview or survey. For example, our engagement producers are trying to learn more about our members and had little experience with user interviews, so I put together some slides and ran a small onboarding session to help them get started.

So, how’s this all working?

This quarter, we’re in the middle of three research projects around membership, newsletter opens and expansion opportunities, and here’s some of what we’re learning:

  • Planning varies a lot from project to project. We went into our membership and expansion projects with a clear idea of what we wanted to research, which made the kickoff work, research scoping meeting, and logistics planning meeting simple and quick: it took 30 minutes to get through all of the planning work for expansion, while our local team needed several meetings with different stakeholders to settle on a research plan.
  • It’s overwhelming to have many sets of user interviews going on at once, even if they’re being run by different teams. If we must have lots of concurrent projects, it’s best to set recurring time blocks 2–3 days per week for user interviews across teams to ensure the project team can still accomplish their production and other sprint tasks.
  • Best practices for recruitment vary by project, but generally, when we're targeting a wide swath of readers, the best way to encourage participation is to offer a free advertisement in one of our newsletters. We originally wanted to offer smaller incentives, like a free month of membership or referral credit, and targeted individual subscribers who met different requirements through Mailchimp, but we found that this level of specificity and variance was generally not worth the time and effort required to manage it.
  • Participant compensation should be static throughout the project. We changed compensation halfway through our local user research project to encourage more participation and ran into logistical complications. Moving forward, we plan to set compensation guidelines once in our scoping phase to make fulfillment much easier.
  • We needed a standard processing checklist for each interview that includes recordkeeping tasks like note-taking, audio recording, transcription, logging in our UX database, and more. For now, one person is assigned to do all of these things, but as we learn, we think this checklist should become part of each interview's definition of done. I made this checklist for future projects.
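To make the checklist idea concrete, here's a hypothetical sketch of it as a small data structure; the task names are illustrative rather than our exact checklist:

```python
# Hypothetical per-interview processing checklist; task names
# are illustrative, not copied from our actual document.

INTERVIEW_CHECKLIST = [
    "take notes during the interview",
    "record audio",
    "transcribe audio",
    "log observations in the UX research database",
]

def remaining_tasks(done: set) -> list:
    """Return checklist items not yet completed, in order."""
    return [task for task in INTERVIEW_CHECKLIST if task not in done]

print(remaining_tasks({"take notes during the interview", "record audio"}))
# ['transcribe audio', 'log observations in the UX research database']
```

Treating an interview as done only when this list is empty makes the "definition of done" auditable, rather than relying on one person remembering every step.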

As our projects progress, we’re constantly updating our documentation. For now, you can view it all in this public Google folder, and we’d love it if you adapted it for your own projects.

Have you put together similar documentation or empowered multiple teams in your workplace to conduct user research? What have you learned? Do you plan to do this in the future? Reach out at carolyn@whereby.us — I’d love to hear about it and learn with you.
