Design Research

Concept testing

Concept testing is a research method that involves getting user feedback on potential solutions during the upfront part of the design process.

Source: Victor Yocco, The Value Of Concept Testing As Part Of Product Design

  • Figma - 31.31% of question respondents
  • UserTesting - 20.2%
  • Zoom - 19.19%
  • Google Meet - 8.08%
  • Qualtrics - 8.08%
  • UserZoom - 7.07%
  • Maze - 6.06%
  • Miro - 5.05%
  • Microsoft Teams - 5.05%
  • InVision - 5.05%
48.29% of participants answered this question.

Card sorting

Card sorting is a method that helps you discover how people understand and categorize information. In a card sorting study, participants group ideas or information written on cards into categories in a way that makes sense to them.

Source: Maze, Card Sorting: Improving Information Architecture

  • Optimal Workshop - 43.52% of question respondents
  • UserTesting - 14.11%
  • Miro - 12.94%
  • UserZoom - 10.58%
  • OptimalSort - 8.23%
  • Maze - 5.88%
  • FigJam - 4.7%
  • In-person - 3.52%
  • Google Slides - 3.52%
  • Qualtrics - 2.35%
41.46% of participants answered this question.

Tree testing

Tree testing is a usability technique for evaluating the findability of topics in a website. It is also known as reverse card sorting or card-based classification.

Source: Wikipedia, Tree testing

  • Optimal Workshop - 50% of question respondents
  • UserZoom - 13.33%
  • UserTesting - 11.66%
  • OptimalSort - 11.66%
  • Treejack - 3.33%
  • UsabilityHub - 3.33%
  • Miro - 3.33%
  • UXArmy - 1.66%
  • PlaybookUX - 1.66%
  • Maze - 1.66%
29.27% of participants answered this question.

First click testing

First Click Testing examines what a test participant would click on first on the interface in order to complete their intended task. It can be performed on a functioning website, a prototype or a wireframe.

Source: Usability.gov, First Click Testing

  • UserTesting - 25% of question respondents
  • Optimal Workshop - 20.83%
  • UserZoom - 18.75%
  • UsabilityHub - 16.66%
  • Maze - 10.41%
  • Figma - 6.25%
  • Internal tools - 4.16%
  • Qualtrics - 2.08%
  • UXArmy - 2.08%
  • SurveyMonkey - 2.08%
23.41% of participants answered this question.

Task analysis

Task analysis is the process of learning about ordinary users by observing them in action to understand in detail how they perform their tasks and achieve their intended goals. Task analysis helps identify the tasks that your website and applications must support and can also help you refine or re-define your site’s navigation or search by determining the appropriate content scope.

Source: Usability.gov, Task Analysis

  • UserTesting - 27.41% of question respondents
  • Zoom - 14.51%
  • UserZoom - 12.9%
  • Figma - 9.67%
  • Google Meet - 8.06%
  • Miro - 6.45%
  • Maze - 6.45%
  • Lookback - 4.83%
  • In-person - 3.22%
  • Interview - 3.22%
30.24% of participants answered this question.

Benchmark testing

UX benchmarking is the process of evaluating a product or service’s user experience by using metrics to gauge its relative performance against a meaningful standard.

Source: Alita Joyce, 7 Steps to Benchmarking Your Product’s UX

  • UserZoom - 15.38% of question respondents
  • UserTesting - 12.82%
  • Miro - 5.12%
  • Baymard - 5.12%
  • Looker - 5.12%
  • Typeform - 5.12%
  • Pendo - 5.12%
  • Internal tools - 5.12%
  • Microsoft Excel - 5.12%
  • Figma - 5.12%
19.02% of participants answered this question.

Usability testing

Usability testing is the practice of testing how easy a design is to use with a group of representative users. It usually involves observing users as they attempt to complete tasks and can be done for different types of designs.

Source: Interaction Design Foundation, Usability Testing

  • UserTesting - 46.34% of question respondents
  • UserZoom - 14.63%
  • Maze - 12.19%
  • Lookback - 7.31%
  • UserZoom GO - 6.09%
  • UsabilityHub - 4.87%
  • Optimal Workshop - 2.43%
  • Loop11 - 2.43%
  • PlaybookUX - 2.43%
  • User Interviews - 2.43%
40% of participants answered this question.

Accessibility evaluation

A holistic assessment of the content, design, and code against a set of the most common impairments, aiming to highlight the areas that users are most likely to find challenging.

Source: Tina Remiz, How to do a UX accessibility evaluation

  • Fable - 12.9% of question respondents
  • Axe - 9.67%
  • Google Lighthouse - 9.67%
  • Internal tools - 9.67%
  • UserTesting - 6.45%
  • Third party - 6.45%
  • Expert review - 6.45%
  • WAVE Web Accessibility Evaluation Tool - 6.45%
  • Jaws - 3.22%
  • Screen reader - 3.22%
15.12% of participants answered this question.

Competitive analysis

A competitive analysis provides strategic insights into the features, functions, flows, and feelings evoked by the design solutions of your competitors. By understanding these facets of competitors’ products, you can strategically design your solution with the goal of making a superior product and/or experience.

Source: Jill DaSilva, A Guide to Competitive Analysis for UX Design

  • Miro - 21.05% of question respondents
  • Google Sheets - 18.42%
  • Figma - 15.78%
  • Microsoft Excel - 13.15%
  • Baymard - 7.89%
  • Notion - 7.89%
  • Third party - 7.89%
  • Confluence - 5.26%
  • Google - 5.26%
  • Google Drive - 2.63%
18.54% of participants answered this question.

Heuristic evaluations

A heuristic evaluation is a method of inspecting and evaluating the usability of a website or product. You may also hear it referred to as a “usability audit” or an “expert review”. Using a set of heuristics, one or more experts will evaluate how well a product complies with these heuristics to define its usability. Often they’ll leverage a scorecard, or numeric-based scoring (weighted by the impact on usability), for each heuristic analysis.

Source: Matt Rae, How to Use Heuristic Evaluations to Improve Product Designs

  • Google Sheets - 30.3% of question respondents
  • Miro - 15.15%
  • Microsoft Excel - 15.15%
  • Figma - 9.09%
  • Google Slides - 6.06%
  • Google Docs - 6.06%
  • Expert review - 6.06%
  • Google Drive - 3.03%
  • In-person - 3.03%
  • Microsoft Teams - 3.03%
16.1% of participants answered this question.