Process Driven


Background:
When Shutterstock contributors submit their photos, vectors, and videos, the content needs to be reviewed by a person. For photos and vectors, this process uses the Review 2.0 platform. Video review, however, was conducted in-house on a separate platform. As video has become an important part of the business, Shutterstock wanted to move video review onto Review 2.0 and, in the process, update the platform itself. The goal was not only to update the technology so that more reviewers could work outside of the office, but also to improve the design experience and reviewers' current workflow.

Role:
I was the main UX researcher and designer on the project. I identified the expectations, timelines, and deliverables. I planned user research sessions with reviewers, followed by two rounds of user testing. During this time, I also conducted a workshop with the review team to understand their process and challenges. I shared all the findings and analysis with stakeholders across the company and worked with the development team to create an improved solution.

What did we do and how:

Human Focus: Understanding the footage reviewers and their work
Being initially unfamiliar with the review process, I started the project by shadowing footage reviewers as they reviewed videos. I also conducted interviews with reviewers in the office. With this initial information compiled, I was able to break the process down into sections and use Post-its to visually lay out the different steps I had learned about.

Gathering information and mapping the process



Footage review decision tree

Information Focus: Workshops
In addition to shadowing and interviews, I wanted a bigger picture of all the footage reviewers' processes and pain points. I took advantage of having users in the same building and facilitated a workshop with hands-on activities.
The first exercise was card sorting. Each card listed a different action that footage reviewers perform (drawn from the interviews), and I also provided blank cards. The footage reviewers organized the cards according to how they usually conduct a review and shared their processes with each other.

I collected the cards at the end to perform an analysis. While every reviewer worked differently, we were able to pinpoint some actions that were often performed at the beginning of the review process. I used these findings to guide the interface design.


Leading card sorting and discussion


Tallying the results



Understanding the current footage review process


Analysis of the current footage review process

The second exercise involved the reviewers listing their challenges and pain points. Everyone wrote their problems on Post-its and placed each one next to the action it related to. Then, everyone voted on the three biggest pain points.

Through this exercise, we realized that the ability to see subsequent videos was important to the footage reviewers. This opened a discussion with business stakeholders and engineers about prioritizing a feature to view groups of videos at once.

Listing out the challenges


Dot voting on the challenges



Analyzing the pain points

Summary of findings


Priority Drive: The Requirement Document
I shared the research findings with engineers and business stakeholders, and we worked through the requirement document from the user, technical, and business perspectives. This helped us prioritize the most important functionalities.

Discussion between UX, engineers and business


Power To The People: User Testing
Based on the requirement document, I created a paper prototype and ran user research sessions with six footage reviewers to get their feedback. Following this, I conducted two rounds of testing using a clickable wireframe.

Using paper prototype to user test the early design


User testing with clickable wireframe
After receiving feedback on the paper prototype, I designed two rounds of clickable wireframes and tested each round with the footage reviewers.

Result! Final Design
By accumulating feedback from users, engineers, and the business team, I was able to design and deliver a final version of the review interface, along with specifications, to the team of software engineers. The design was successfully implemented and tested with the footage reviewers.

Single Asset

Batch View