Just enough research

by Erika Hall

A book about research that can be done quickly and on a budget.

Research can

  • Determine if you are solving the correct problem

  • Pinpoint who in the organisation is likely to cause problems or have a problem with the project

  • Discover best competitive advantages

  • Learn how to convince customers to care about the same things you do

  • Identify small changes that can make a huge difference

  • Become aware of personal biases and blind spots

Research is:

  • Systematic enquiry

  • There are 4 types:

  • Personal research: e.g. Google

  • Pure research: Science. Strong ethics, peer-reviewed. Aiming to create new human knowledge

  • Applied research: Strong ethics, research style is more relaxed than pure. Solves specific real world problems

  • Design research: Broad term. In academia it is the study of design.
    Within Industrial/interactive design it is research that is an integral part of the design process – focuses on understanding the people/end user [and how they interact with the product].

Research will:

Give you stronger arguments, clarity of purpose, the freedom to innovate, and knowledge of your constraints

Design research is not:

  • Asking people what they like

  • A political tool

  • A science (it is qualitative, not quantitative)

FINDING YOUR PURPOSE

You need to define a purpose and a topic.

To find the purpose:

Generative / exploratory research: “What’s up with…?”

  • High level “What is a good problem to solve?”

  • Can include interviews, field observation, reviewing literature etc

  • Study research to pick out common themes – may lead to a hypothesis that you then explore in more detail through descriptive & explanatory research

Descriptive & explanatory research: “What & how?”

  • Similar activities to above, but ‘lower level’: “What is the best way to solve the problem I have identified?”

  • From users perspective not own

Evaluative research: “Are we getting close yet…?”

  • Design potential solutions. Test against identified problems and iterate. Most common is usability testing

Causal research: “Why is this happening?”

  • After launch/implementation, may need to review outcome. Establishing cause-and-effect can be tricky. Look at analytics and conduct multi-variate testing

Roles/tasks to cover:

  • Author

  • Interviewer/moderator

  • Co-ordinator/scheduler

  • Notetaker/Recorder

  • Recruiter

  • Analyst (have more than one)

  • Documenter

  • Observer

During the research process

  • You will hear objections
    (Chapter 2 - loads of advice on predicted objections and suitable responses)

  • Be prepared for research in any context
    (Chapter 2 – again, advice for several different situations)
    eg. freelance, client services agency, in-house at an established company, in-house at a startup, working with an agile dev team

  • Apply just enough rigor
    Be aware of your own bias: design bias (of the study); sampling bias; interview bias; sponsor bias; social desirability bias; the Hawthorne effect (your presence may change the behaviour of users)

  • Be aware of ethical concerns
    The project as a whole: Is it ethical?; The goals or methods: Are you tricking your participants or setting unrealistic expectations about the world?; Consent and transparency: Informed consent is the rule; Safety and privacy: Ensure your presence doesn’t create a risk.

  • Be a skeptic – ask lots of questions and constantly be on the lookout for threats and potential points of failure

Best practices

  1. Phrase questions clearly

  2. Set realistic expectations

  3. Be prepared

  4. Allow sufficient time for analysis

  5. Take dictation

How much research is enough?

  • Avoid unnecessary research
    Identify highest priority questions – assumptions that carry the biggest risk
    (research provides clarity and confidence to design, and reduces the risk that could be incurred by relying on false assumptions or focusing on the wrong business goals). Erika Hall provides a list of questions to ask.

  • You will know it’s enough when all the pieces click into place

THE PROCESS

  1. Define the problem (What you want to find out)
    Base the statement on a verb that indicates an outcome, e.g. “describe,” “evaluate,” or “identify” (not open-ended verbs like “understand” or “explore”).

  2. Select the approach (How you will find out)
    The problem statement will direct you to the type of research. Write a quick description of the study by incorporating the question.

  3. Plan and prepare for the research

    • Identify the plan keeper

    • Formulate a plan. (Budget, time, roles. Identify subjects and a recruitment plan. Make a list of materials.)

    • Recruit people (EH has lots of notes on this)

      • Write a screener

  4. Collect the data

    • Photos, videos, screen captures, audio recordings, handwritten notes… get everything backed up and/or onto a shared drive ASAP

    • Be organised – e.g. agree naming conventions

    • Pick the easiest-to-use tools for the team (see resources section)

    • Interviews are the most effective way to get inside a user’s head (for success: prepare, structure, and conduct)

    • Usability testing (a directed interview while user uses the prototype/product)

    • Literature review

  5. Analyse the data

    • Gather all data and look for meaningful patterns

    • Turn patterns into observations

    • Observations will lead to recommendations

    • Refer back to initial problem statement and ask how the patterns answer original questions posed

    • Get everyone involved in the analysis – structure a session (lots of notes on how to do this from EH)

    • When analysing the data look for: Goals, Priorities, Tasks, Motivators, Barriers, Habits, Relationships, Tools, Environment

  6. Report the results

    • Write up a brief, well-organised summary to include: goals, methods, insights and recommendations

    • Include quick sketch personas

    • Photos of whiteboards, stickies etc

  7. And repeat

ORGANISATIONAL RESEARCH

Business analysts normally do this kind of research – what drives a business, how all the pieces work together, and the extent of its capacity for change. But the process is very similar to traditional user research. The nature of an organisation matters to a design process, so it is vital that one speaks to the stakeholders to understand it. You should interview:

  • Executives

  • Managers

  • Subject matter experts

  • Staff in various roles

  • Investors and board members

EH has loads of advice on how to structure and interview stakeholders (Chapter 4)

[The rest of this book summary is copied verbatim from Ananda Vickry Pratama’s excellent book summary on Medium (it’s a great book and I will be forever referencing back to it). Thanks for doing the summary!!]

Who are stakeholders?

  • Defined as “those groups without whose support the organization would cease to exist.”

  • Executives, managers, subject matter experts, staff in various roles, investors and board members.

Interviewing stakeholders

“Interviews with project stakeholders offer a rich source of insights into the collective mind of an organization. They can help you uncover areas of misalignment between a company’s documented strategy and the attitudes and day-to-day decision-making of stakeholders. They can also highlight issues that deserve special consideration due to their strategic importance to a business.” — Steve Baty, “Conducting Successful Interviews with Project Stakeholders”

What interviewing stakeholders is for:

  • Neutralizing politics.

  • Better requirements gathering.

  • Understanding organizational priorities.

  • Tailoring the design process.

  • Getting buy-in from said stakeholders.

  • Understanding how your work affects the organization.

  • Understanding workflow.

Types of interviews:

  • Individual interviews.

  • Group interviews.

  • Email interviews.

1. Interview structure

  • Introduce yourself.

  • Explain the purpose of the meeting.

  • Explain how the interview data will be shared.

  • Be sure that people can speak freely.

2. Dealing with hostile witnesses

  • Do your research ahead of time to predict if a stakeholder might be combative.

  • Remain calm and confident (practice with members of your team beforehand).

3. Documenting interviews.

What to do with stakeholder analysis

What you should include in this documentation:

  • Problem statement and assumptions.

  • Goals.

  • Success metrics.

  • Completion criteria.

  • Scope.

  • Risks, concerns, and contingency plans.

  • Verbatim quotes — Very valuable but try to anonymize them.

  • Workflow diagrams (see graphic).

A workflow diagram can describe the current situation or illustrate your recommendation based on what you’ve learned about the organization.

USER RESEARCH

When we talk about user research as distinguished from usability testing, we’re talking about ethnography, the study of humans in their culture. We want to learn about our target users as people existing in a cultural context. We want to understand how they behave and why.

Everything that factors into context

  • Physical environment

  • Mental model

  • Habits

  • Relationships

Assumptions are insults

Getting good data from imperfect sources

What is ethnography?

The fundamental question of ethnography is, “What do people do and why do they do it?” In the case of user research, we tack on the rider “…and what are the implications for the success of what I am designing?”

The four Ds of design ethnography

  • Deep dive — Get to know a small but sufficient number of representative users very well

  • Daily life — It’s of limited utility to learn how people behave in your conference room, so go to where they live and work.

  • Data analysis — Systematic analysis is the difference between actual ethnography and just meeting interesting new people at a networking event.

  • Drama! — Lively narratives help everyone on your team rally around and act on the same understanding of user behavior

Interviewing humans

  1. Preparation

  2. Interview structure: Three boxes, loosely joined

  • Introduction: Say hello, express gratitude, talk about why you’re there,
    review demographic information

  • Body: Ask open-ended questions, follow up or probe as necessary, allow
    pauses and silences.

  • Conclusion: Express gratitude again, ask if they have questions, talk about
    next steps.

  3. Conducting the interview

  • Don’t forget to breathe

  • Practice active listening (Nod and say “mm-hmm” but pay close attention)

  • Keep an ear out for vague answers

  • Avoid talking about yourself

Contextual inquiry

Contextual inquiry is a deeper form of ethnographic interview and observation. It is particularly useful for developing accurate scenarios, the stories of how users might interact with potential features.

Things to keep in mind

  • Travel — Allow plenty of time to get to the site and set up.

  • Get situated — Find a comfortable spot that allows you to talk to the participant without interrupting their normal routine.

  • Interview — Establish trust and learn about what you will be observing. Find out when it will be least disruptive to interrupt and ask questions.

  • Observe — It’s a show. You’re watching. Note everything in as much detail as possible. The relevance will be apparent later. Pause to ask questions. Stay out of the way.

  • Summarize — Conclude by summarizing what you learned and asking the participant to verify whether your observations were correct.

Focus group — Focus groups are the antithesis of ethnography.

COMPETITIVE RESEARCH

You need to know not only who your competitors are from the perspective of the business (that’s generally obvious) but who competes for attention in the minds of your target users.

SWOT analysis — Plotting out strengths, weaknesses, opportunities, and threats

A SWOT analysis organized in a simple grid can help you grasp your competitive position.

Competitive audit — Once you have identified a set of competitors and a set of brand attributes, conduct an audit to see how you stack up.

Brand audit — Your brand is simply your reputation and those things that signify your identity and reputation to your current and potential customers.

Here are the questions you need to ask about your brand:

  • Attributes

  • Value proposition

  • Customer perspective

Name — The name is the single most important aspect of a brand.

Logo — The logo is simply the illustrative manifestation of your brand, which can take several forms: wordmark, bug, app icon, favicon, etc.

Usability-testing the competition — Just what it sounds like. Take those usability testing skills and apply them to someone else’s product or service.

EVALUATIVE RESEARCH

Evaluation is assessing the merit of your design. It’s the research you never stop doing. There are several ways to go about it, depending on where you are in the project.

Heuristic analysis — “Heuristic” in English simply means “based on experience”; a heuristic is a qualitative guideline, an accepted principle of usability. The method is very simple: evaluators (at least two or three, ideally) individually go through a site or application with a checklist of principles in hand and score the site for each one. The standard checklist is Nielsen’s ten heuristics.
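The scoring step can be sketched in a few lines. This is only an illustration: the heuristic names follow Nielsen’s set, but the three evaluators’ scores are invented.

```python
# Minimal sketch of tallying heuristic-analysis scores: each evaluator
# rates the product 0-4 against each heuristic, and averaging the
# scores flags the weakest areas. All scores here are invented.
scores = {
    "Visibility of system status":         [3, 4, 3],
    "Match between system and real world": [2, 2, 3],
    "User control and freedom":            [1, 2, 1],
}

averages = {h: sum(s) / len(s) for h, s in scores.items()}
worst = min(averages, key=averages.get)
print("Lowest-scoring heuristic:", worst)
```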

Usability testing — Usability is the absolute minimum standard for anything designed to be used by humans. If a design thwarts the intended users who attempt the intended use, that design is a failure from the standpoint of user-centered design. Usability is a quality attribute defined by Nielsen in terms of five components.

Do the cheap tests first and the expensive ones later:

  • Start with paper prototypes and sketches

  • Look at and test competitor’s products

  • Test at every stage (as much as time will allow)

1. Preparing for usability testing

What you need:

  • A plan (What are the tasks that you need to cover? Include the tasks as part of a larger scenario so the user can understand the context.)

  • A prototype or sketch.

  • Four to eight participants of each target user type based on personas (ideally) or marketing segments.

  • A facilitator.

  • An observer.

  • One or more methods of documentation.

  • A timer or watch

Recruiting — Recruiting for usability testing is substantively the same as for ethnographic interviews.

Facilitating — A good facilitator is personable and patient.

Observing and documenting — Even if you are set up to record, it’s very important to have a second person observing the tests and taking notes.

Eye-tracking — Eye-tracking measures where someone is looking, how long, and in what direction.

2. Analyzing and presenting test data

The aim of usability testing is to identify specific significant problems in order to fix them. The outcome is essentially a ranked punch list with a rationale.

ANALYSIS AND MODELS

Analysis involves a few simple steps:

  • Closely review the notes.

  • Look for interesting behaviors, emotions, actions, and verbatim quotes.

  • Write what you observed on a sticky note (coded to the source, the actual user, so you can trace it back).

  • Group the notes on the whiteboard.

  • Watch the patterns emerge.

  • Rearrange the notes as you continue to assess the patterns

Affinity diagram — Clusters of related observations. Each cluster then lets you extract insights and make recommendations.

Steps:

  • Write down observations

  • Create groups, noting all stated and implicit goals

  • Identify next steps

An affinity diagram helps turn research into evidence-based recommendations.
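The grouping step above can be sketched in code. This is only an illustration: the themes, notes, and participant codes (“P1”, “P2”…) are invented, and in practice the clustering happens on a whiteboard, not in a script — but it shows why coding each note to its source keeps clusters traceable.

```python
from collections import defaultdict

# Each observation is coded to its source participant so a cluster can
# always be traced back to the real users behind it.
observations = [
    ("P1", "booking", "Gave up after the third form screen"),
    ("P2", "booking", "Asked a friend to finish checkout"),
    ("P1", "trust",   "Wanted a phone number before paying"),
    ("P3", "trust",   "Checked reviews on another site first"),
]

clusters = defaultdict(list)
for participant, theme, note in observations:
    clusters[theme].append(f"{note} ({participant})")

for theme, notes in clusters.items():
    print(theme, "->", notes)
```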

Creating personas — A persona is a fictional user archetype — a composite model you create from the data you’ve gathered by talking to real people — that represents a group of needs and behaviors.

A persona document should feel like the profile of a real individual while capturing the characteristics and behaviors most relevant to your design decisions

Mental Models — A mental model is an internal representation of something in the real world — the sum total of what a person believes about the situation or object at hand, how it functions, and how it’s organized.

Creating a mental model

  • Do user research.

  • Make an affinity diagram.

  • Place affinity clusters in stacks representing the user’s cognitive space to
    create the model. These groups will include actions, beliefs, and feelings.

  • Group the stacks around the tasks or goals they relate to.

Mental model diagrams illustrate your user’s thought processes in detail.

Task Analysis/Workflow — Task analysis is simply breaking one particular task into the discrete steps required to accomplish it.

This task path for ticket purchase can help identify areas where the user needs specific content and functionality to meet her goal.

QUANTITATIVE RESEARCH

Qualitative research methods such as ethnography and usability testing can get you far, but you still won’t get everything right. Once your web site or application is live, then you have quantitative data to work with.

Optimizing a design is the chief aim of quantitative research and analysis

Conversions — A user is said to convert any time they take a measurable action you’ve defined as a goal of the site. Measuring the conversion rate for each of these will indicate the success of that particular path, but not how each type of conversion matters to the success of the organization itself. That is a business decision.

Analytics — Analytics refers to the collection and analysis of data on the actual usage of a website or application to understand how people are using it. Over half of the world’s websites have Google Analytics installed.

Some of the basic stats to look at include:

  • Total number of visits.

  • Total number of pageviews.

  • Average number of pages per visit.

  • Bounce rate (the percentage of people who leave after viewing one page).

  • Average time on site.

  • Percentage of new visitors
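All of these stats derive from raw visit data. A minimal sketch, assuming a made-up log format of (visitor, pages viewed, seconds on site, new visitor?) rather than any particular analytics API:

```python
# Hypothetical visit log: (visitor_id, pages_viewed, seconds_on_site, is_new)
visits = [
    ("u1", 1, 12, True),
    ("u2", 5, 340, False),
    ("u3", 1, 8, True),
    ("u4", 3, 95, True),
]

total_visits = len(visits)
total_pageviews = sum(pages for _, pages, _, _ in visits)
pages_per_visit = total_pageviews / total_visits
# A bounce is a visit that views exactly one page.
bounce_rate = sum(1 for _, pages, _, _ in visits if pages == 1) / total_visits
avg_time_on_site = sum(secs for _, _, secs, _ in visits) / total_visits
pct_new_visitors = sum(1 for *_, new in visits if new) / total_visits

print(f"bounce rate: {bounce_rate:.0%}, pages/visit: {pages_per_visit:.1f}")
```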

Split Testing (a.k.a. A/B Testing) — This method is called split testing because you split your traffic programmatically and randomly serve different variations of a page or element on your site to your users.

How to do it:

  • Select your goal.

  • Create variations.

  • Choose an appropriate start date.

  • Run the experiment until you’ve reached a ninety-five percent confidence level.

  • Review the data.

  • Decide what to do next: stick with the control, switch to the variation, or run more tests
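The 95% confidence level in the steps above is commonly checked with a two-proportion z-test. A minimal sketch (the visit and conversion counts below are invented):

```python
from math import erf, sqrt

def split_test_confidence(conv_a, n_a, conv_b, n_b):
    """Two-sided confidence that control A and variant B convert at
    different rates (pooled two-proportion z-test)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 0.0
    z = abs(p_b - p_a) / se
    # confidence = 1 - two-sided p-value = erf(z / sqrt(2))
    return erf(z / sqrt(2))

# Invented numbers: control converts 120/2400, variant 150/2400.
conf = split_test_confidence(120, 2400, 150, 2400)
print(f"confidence: {conf:.1%}")  # keep the test running until this passes 95%
```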

CONCLUSION

Form questions. Gather data. Analyze. One sequence, many approaches. Get started (right now!) and develop a research habit wherever and however you work.
