Research

Principles (yes, they are necessary)

What I do when someone says “we need research”

Then, do the research: a real-world example

Principles

Learn (and use) something you don’t know.

The point of applied research, which is what I do most of the time, is to learn something and apply what is newly learned.

Lots of people understand this, but… more do not. Teams ask for research when they “already know” what the outcome will be. Or they have no intention of changing direction no matter what I uncover in my work.

This unfortunate and common situation makes research a low-impact exercise that grinds researchers down to checking boxes instead of working hard to find answers. And it’s why I ask about assumptions or predictions, as well as what a team is prepared to do if results come back recommending major changes.

We could make this about respecting expertise, and sometimes it genuinely is. But more often, it's bias that gets in the way.

That’s why it takes a skilled researcher, facilitator and communicator to help people understand results and feel safe changing direction.

Answer the Next Best Question.

Next, I help teams establish clear goals for all of the research I will do, then I break down how we get to those goals. I narrow the focus down to the next best question we can ask, and do a little predicting about subsequent questions.

Taking the time to do this ensures buy-in from everyone who will have to take action based on the results, and it gives me a single question I can use to do my first study and build confidence and trust with a team.

My spider sense for what it’s going to take to get to “done” is pretty accurate most of the time. I can usually estimate based on the scope of the problem and the complexity of the questions.

Yup. I use agile.

Sometimes all my questions are answered in one study. Sometimes they aren’t. Sometimes we pivot the research or the team. Sometimes we come up with a completely new product.

Synthesis: most valuable & overlooked player.

So many teams settle for a few slides of tepid, simplified, neutral insights. So many researchers think that’s all they should deliver because they don't decide what the direction will be anyway.

But that is not how people actually respond in studies, nor is it what you will respond to when I share results.

People have strong feelings. You want clear answers.

I have a trained ear and eye that see beyond a pile of transcripts and recordings into the beliefs and needs that sit behind what people say. I go deeper than “people want this to be easy” and tell you what easy actually means for customers.

I turn outputs of a study into a) what we can do with new knowledge that is b) relevant to the people whose problems you want to solve.

What I do when someone says “we need research”

Determine the Purpose: Fundamental, Applied or Speculative?

Fundamental research is there to simply increase understanding of a problem space. I use this when there’s low fluency around a market, problem or product.

Applied research is the kind that seeks to find and understand practical solutions to specific problems. Most commercial research falls into this category because an existing company has usually chosen its problem to solve, and has some kind of solution in-market already.

Speculative research is exploratory: not bound by current solutions and with no profit-motivated end in itself. Usually this shows up in R&D labs, which wax and wane in companies depending on who is leading them and their appetite for risk.

Even though I’m typically aiming at answering only one question in a study or several similar questions in a larger plan, there are always these threads people can choose to pull on in my findings and synthesis. Those can go in some really interesting directions.

I’m usually directly asked to do applied research but end up delivering a good dose of fundamental with a dash of speculative.

Set the Approach: Qualitative or Quantitative?

Quantitative research (quant for short) will tell you how much or how many, how many times, and how often. Market research is usually quant and can be eerily accurate in how it predicts choices. Behavioral science relies on this a lot.

Qualitative research (qual for short) will give you the why behind those choices. And the why can help you solve problems in focused, unique, and more successful ways. It also delivers better results if you’re looking for seriously new ideas.

I usually try to pair them together to get the best recommendations possible. The qual synthesis usually involves direction-setting, while the quant helps prioritize what will have the best impact.

Choose the Methods: Whatever will ethically answer the question

Fundamental research

  • Heuristic analysis and baseline usability testing

  • Data from an analytics team, marketing data & strategy, customer success and service data, and customer verbatims

  • Competitive and sector research to understand the space a solution is in

  • General desk or online research to learn the subject matter

  • Ride-alongs with employees, such as sales or customer support

Quantitative research

  • Baseline surveys about the current product and any currently available results from Qualtrics surveys and the like

  • Post-launch surveys to understand success of a new release

Qualitative research

  • Grounded theory, ethnographic observations or interviews, diary studies

  • Co-creation and collaborative design charrettes

  • I don't do focus groups because they generate groupthink

The fun part! Getting creative about the actual methods. This is where experience with many methods comes in handy: I can choose methods that I know in advance will deliver the kind of results I need to have the best impact possible.

Then, do the research

A real-world example of going from idea to concept to usability testing in the same problem space


Where we started

I was hired to lead the design of a new service that would bring goals-based investing advice to people who needed it. I worked with a lead researcher. We co-designed and ran our studies together.

For about 3 months while I onboarded into the company, we started out with a lot of behavioral science concepts and tested them. They had mixed results and I couldn’t see how they’d weave the whole service together.

We also worked on integrating research into our agile process.

Finding the next best question

I was staring at a whiteboard one day. My partner came over and said, “Hey, you look frustrated.” That was when we decided to only answer the next best question, which is honestly the best way I’ve ever found to deal with gnarly, wicked problems in service design work.

For the next 2 months, we went into homes and talked to people about their values, goals and beliefs around money. We used the theory of planned behavior, psychological distance, and the theory behind goals-based investing in that work.

Things started to click. I came up with how we would have people select goals and how we would get people to see where their money was going.

Testing, testing, 1…2…3

Slow is smooth, smooth is fast. Now we really had ideas to work with.

Over the following 3 months, we iterated our way into a complete onboarding design, which was the critical (and kinda cumbersome) part of the service. We had to take people from being sorta curious and possibly defensive to definitely interested, then all the way to opening an account, asking for personal information the whole time. Smoothly and in a way that felt easy, and above all, not patronizing.
