Video Editing

Powerful media preparation and focused editing for modern workflows.

Summary

This video editing suite addressed the challenges of media preparation and editing with a streamlined, direct experience focused on 4K, Sony RAW, and HD workflows.

Based on previous research and conversations we had with editors, we knew that audio and video effects were a core part of the editing workflow.

With the redesign effort, we saw an opportunity to explore how we might help our personas use and manage video and audio effects.

MY ROLE

Senior Product Designer (UX) and Researcher

AT A GLANCE
USERS:

Professional video editors; 20 user interviews and contextual inquiry with 5 users

TEAM + PARTNERS:

Product Design team within Engineering; tested with stakeholders, 6 internal team members, and 10 users

METHODS:

Proto-persona workshop, interviews, contextual inquiry, sketching, IA/content modeling, high-fidelity comps, moderated usability testing, iterative refinement (60+ iterations)

The Problem

The existing effects workflow was hidden behind a small icon on a timeline event, which made it easy to miss and hard to learn.

That hurt usability, slowed down editing, and increased the risk that users would assume the product lacked capabilities they expected from a professional editor.

The redesign needed to support a wide range of editor needs, from solo freelancers to larger productions, where plug-ins and color workflows are central to getting work out the door.

The Outcome

Improved discoverability of effects by moving from a hidden icon to a consistently visible entry point, reducing “missed feature” risk during onboarding.

Faster application of effects to events through fewer clicks and clearer affordances, with a ~30% reduction in median time-to-apply in moderated testing.

Better scalability for complex projects by prioritizing space for plug-in lists and effect chains, reducing scrolling and context switching.

Reduced design churn and rework by aligning early on personas and a shared effects model, leading to smoother stakeholder reviews and fewer late-stage reversals during sprint planning.

Created a reusable content model and decision record that engineering and design could reference, improving consistency across related features.

Moderated tests demonstrated comparable or better task success than alternatives, with fewer clicks, supporting stakeholder confidence to proceed with the space-prioritized design.

Discovery

UX Research

During discovery, I conducted 20 user interviews and ran contextual inquiry with 5 users.

Interviews were conducted in person or via Skype. Contextual inquiry sessions were conducted in person.

For this project, we wanted to learn:

  • What types of jobs do you take on?
  • What are your primary goals when using a video editor?
  • How complex is a typical project?
  • Do you collaborate with others at work?
  • If so, why do you need to collaborate?

Outcomes

Key insights included:

  • The types of work editors took on varied significantly.
  • Not all editors faced the same challenges.
  • Completed videos for commercials, sporting events, weddings, and feature-length films were delivered across multiple media formats.
  • Editors expected an all-in-one tool that helped them achieve their creative vision.
  • Projects could be incredibly complex, with many video and audio tracks.
  • Collaboration mattered for a smaller subset of editors.
  • Many users were freelancers with no employees.
  • On larger productions, collaboration was more common because roles were more specialized.

Color grading was often treated as a workflow of its own.

Users expected plug-in support, both as a plug-in host and through plug-ins that helped them push their ideas further. We wanted to support that.

Information Architecture

Based on conversations with stakeholders and users, we started by creating a content model of all the places an effect could appear.

This meant examining the timeline, tracks, and events, and understanding not only what editors needed to accomplish with effects, but also where they would look for effects.

I started with mind mapping: what properties did timeline-, track-, event-, and media-level effects share? I mapped out everything I could think of.

These mind maps captured how I organized my thinking after reviewing the research and before moving into sketching.
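
To make the model concrete, here is a minimal sketch of how those levels and their shared properties could be expressed as TypeScript-style types. The names (EffectScope, EffectInstance, EffectChain) and fields are hypothetical and chosen for illustration only; they are not the product's actual schema.

  // Illustrative sketch only: hypothetical names, not the real data model.
  // It captures the idea that an effect can be attached at several scopes
  // (media, event, track, timeline) while sharing a common set of properties.
  type EffectScope = "media" | "event" | "track" | "timeline";

  interface EffectInstance {
    id: string;
    name: string;              // e.g. a built-in color effect or a third-party plug-in
    kind: "video" | "audio";
    scope: EffectScope;        // where in the project the effect is applied
    enabled: boolean;          // an effect can be bypassed without being removed
    order: number;             // position within the effect chain at that scope
    parameters: Record<string, number | string | boolean>;
    isPlugin: boolean;         // third-party plug-ins sit alongside built-in effects
  }

  interface EffectChain {
    scope: EffectScope;
    targetId: string;          // the media, event, track, or timeline the chain belongs to
    effects: EffectInstance[]; // ordered list; the space-prioritized design gives this the most room
  }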

I shared this with the team for feedback and incorporated their comments.

Next I began sketching potential approaches.

Wires and Comps

Proposed Versions

From the sketches, a few ideas stood out more than others.

We moved into full high-fidelity comps for deliverables, which is what you see in the proposed versions.

These were the strongest contenders. They leveraged existing patterns in the app.

We created these high-fidelity comps to test with stakeholders and users. They went through two sprints and more than 60 iterations.

Testing included 6 internal team members and 10 users. All sessions were moderated, with a mix of in-person and remote testing.

These designs were created to evaluate the usability of the proposed approaches.

Key question:

Can you apply an effect to an event?

Current Version

This reflects what exists today with minor differences.

Maximizing available space for the plug-in list and plug-in chain was a high priority.

At the time, the thinking was that this approach would make the best use of available space and improve discoverability, because effects lived in a dedicated tab that stayed visible.

We built three prototypes and tested them. This version required fewer clicks overall and had better task success rates than the comparable approaches.