We got POP - Search feature

An in-depth research on improving the search functionality of the platform

Platforms: Web application

Toolkit: Guerrilla user testing, user surveys, user journeys, heuristic evaluation, comparative study.

The challenge: Perform UX research and suggest improvements to how search works

Cover of the study

The 'We got POP' platform is a browser-based application that allows Assistant Directors and Casting Agents to cast, book, and pay extras effortlessly for any project. One of my tasks was finding ways to improve “POP Precision”, a search feature that helps users find the best-fit extras for their casting needs.

My part in this project was to conduct all the necessary research and evaluate the end-to-end experience of the primary user base as they search and book artists.

Note: The following are excerpts from the study

Purpose and Scope

The purpose of this study was to evaluate how users interact with the search for, and selection of, suitable extras for each project.

Collecting this data would provide us with:

Methodology

For the complete analysis, please read the full study.

Comparative assessment

A comparative assessment allows us to explore how other websites and platforms handle the search process. Although these are not direct competitors of our platform, they are systems used by a large number of people.

We often find that users benchmark their current user experience against products they have used in the past. It is also these past experiences that have taught users what to expect and how to interact with commonplace user interface elements.

Research Study

The study collected qualitative and quantitative data to answer several research questions, including:

Study Design

One part of the study was attitudinal research, conducted via an online survey directed at our current in-house and client users. The collected data was quantitative (“closed questions”, e.g. category-ranking metrics) and qualitative (“open-ended questions”, e.g. participant comments).

For the second part of the study, I conducted behavioural research in the form of an in-person usability study using recording software, in order to gather insights into user performance and unmet needs.

The study collected information such as task completion rates, time on task, navigation and content insights, overall satisfaction, areas of concern, and unmet needs.

User Testing Audience - I conducted the study with 5 participants who represented a broad spectrum of usage behaviours. I recommended this audience because they matched the age bracket and typical activities of our AD personas. Five participants is the recommended sample size for recurring qualitative testing.

Heuristic Evaluation

After testing, I followed up with an evaluation method based on established usability principles as defined by the Nielsen Norman Group. These 10 usability heuristics help identify core problems within digital products and ensure optimal usability at a base level.

This evaluation was conducted to help identify issues in the platform that did not conform to universal user experience standards and basic principles.

Outcomes

The study provided us with:

You can find the study in full here:
