Amperity User Experience Research (UXR)

March 01, 2022

A beginner navigating the design & execution of a UXR study.

Role: Plan design, research & interviewing.
Team: Shaun Yap, Senior Manager, Web Marketing // David Stychno, Design Director
Also consulted Ample & Cassie W, Director of Product Design.


Web features were often added based on perceived need, a process which is subject to blind spots and bias. I piloted a user testing program to conduct qualitative research that the team could use to test hypotheses & improve site user experience.



Our small team of two had limited funding & time to allocate to user research. Because of this, we needed a platform that offered a short runway to launch. In addition, recruiting the right audience was essential to getting valuable feedback. After evaluating several tools (including the very robust UserTesting), I landed on UserInterviews, a platform that boasts targeted recruiting and proved to be cost effective as well.


I worked with the UX team at Ample to develop a few user personas based on existing internal documentation and web usage data. The final three personas helped me define criteria for recruitment & clarify direction for the study.

Study Design

I went into this project knowing nothing about how to run a user study. Fortunately, my colleagues were supportive and enthusiastic. UserInterviews offered an incredibly detailed UX Research Field Guide; I sat down with this resource & read it from beginning to end. I also spoke with Cassie & Dave, both seasoned designers with interviewing experience, to get some tips from the pros.

From there, I was able to draft out the following sketch:

Recruitment Criteria

Audience: Employees from B2C companies in our Ideal Customer Profile, as well as existing customers
Location: US, Canada, and Australia
Department: Marketing, Analytics, IT
Device: Desktop & mobile (mobile accounted for 40% of traffic in 2021)

I also defined filters by industry, revenue, and job title, which are not listed here.


Because this would be Web’s first UXR study, I wanted to test out a couple of different approaches to see how the study could scale.

In an unmoderated session, participants were given the site URL and asked to record themselves navigating the site. They were also given a document with questions to answer before, during, and after the experience. We could then review recordings on our own time, saving time during interviews & allowing us to reach audiences in different time zones.

In a moderated session, participants were given instructions for navigating the site as the interviewer & note-taker observed. The interviewer would ask the user to walk through their thinking where appropriate; the note-taker would jot down notes. A few minutes before the session ended, participants were shown prototypes of new designs & then asked to rate their experience.

A variation of the moderated session was the customer interview, where we got face time with a couple of our existing customers. Customers work entirely within the Amperity platform, so they are not a primary audience for the site. Nonetheless, it was a good opportunity to get feedback on our product messaging from folks who knew it well.

Each interview lasted 45 minutes to an hour, and participants were compensated for their time. Sessions were scheduled through the UserInterviews platform, and moderated sessions were conducted over Zoom.

A few different methods were ultimately employed: usability testing (screen recordings), task analysis, user feedback survey, and user interviews.


I wasn’t really prepared for the amount of writing that designing a UXR study would involve.

The screener survey, despite being only ten questions long, took significant time to craft. It was important to word questions in a way that would screen for qualified applicants without letting my own bias interfere. For example, to get an idea of the applicant’s experience navigating similar sites, I asked: “How often do you use a B2B website?” This question helped determine whether a user would be familiar with the site structure & layout patterns common to other B2B sites.

For the interview scripts, I borrowed a template from our Product Design team, rewriting most of the document to generate a script for each of the three scenarios: moderated, unmoderated, and customer interview.

Impact & Learnings

The project was a success. We interviewed a total of 11 professionals across various locations, industries, and roles, accumulating pages and pages of notes. The role of interviewer came naturally to me because I (1) was genuinely interested in user feedback and (2) had done something very similar as a housing case manager, triaging clients.

As the team gained more valuable notes, we began experimenting with the interview format & direction — for example, doing some light brand testing or getting feedback on prototypes & work in progress.

Some key learnings as an interviewer:

  1. Refrain from asking leading questions - This was easier said than done. Fortunately, my colleagues were not afraid to let me know when I asked a leading question, and I was able to better self-monitor in subsequent interviews.
  2. Develop rapport with the interviewee - It was easy to overlook this step at first, but I quickly realized that the first moments of introductions can set the tone for the rest of the interview.
  3. Use interviewee responses in follow-up questions - A true sign of active listening; I could demonstrate that I was picking up what they were putting down.

Once the team was able to consolidate findings and nail down three to four key learnings, we took action. Small changes were addressed immediately; significant chunks of website content & design were updated within the month. The findings also launched a larger project around the information architecture and overall messaging of the website.

Screenshot of a slide deck outline view.
The UX learnings deck identified many areas of improvement across the site.

Future Considerations

  • Expand the number of sessions
  • Expand the number of pages on the site “within scope”
  • Modify the study to address specific hypotheses
  • Utilize the platform to do more focus groups around brand or big campaign ideas
  • Try out a different UXR platform for a different recruitment pool