Box view

The Problem

An incomplete picture

During a performance review, managers often do an activity called calibration. It's when managers get together to align on each employee's performance. They discuss which employees meet expectations, exceed expectations, and so on.

Before this project, Lattice's calibration tool was essentially a big table, as shown below. While this tool was functional, customers complained about the inability to see patterns or outliers in this data. As a workaround, some customers would export the data to create charts in Excel—a huge pain.

Lattice's calibration tool before I started this project

Our Approach

Make it visual

Our hypothesis was that we could improve this experience by making it more visual. In previous research, we had learned that some customers used a framework called “9-box” to calibrate employees outside of Lattice. This framework served as inspiration for one of my first sketches, shown below.


Early sketch of box view

Discovery

Testing the waters

Although we heard a few customers talk about their 9-box process, we weren't 100% sure it was the right way to visualize calibration. After a few rounds of sketching and exploration, we came up with 3 concepts that we wanted to test with customers:


Box View was inspired by the 9-box framework


Scale View focused on visualizing one question at a time


Focus View focused on 1 employee at a time to avoid bias

To test these concepts, my Product Manager and I conducted interviews with 8 customers, 5 customer success managers, and 3 in-house HR experts. In the end, we concluded that the box view concept addressed the biggest pain points of calibration. Specifically, it made it easier to visualize rating distributions and spot outliers during a live calibration session.

Our Team

All hands on deck

Once we had a general idea of the approach we wanted to take, we looped in the engineers on our team. To mimic the 9-box activity that some customers did in person, we wanted this experience to be highly interactive. For example, we wanted to support drag-and-drop to put employees in different boxes.
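To make the drag-and-drop idea concrete, here's a minimal sketch of the kind of state update it implies. All the names here (`Employee`, `moveEmployee`, the sample roster) are hypothetical illustrations, not Lattice's actual code.

```typescript
// Hypothetical model: an employee either sits in a numbered box
// or in the bottom tray (box = null) if they haven't been scored.
interface Employee {
  id: string;
  name: string;
  box: number | null;
}

// Dropping an employee onto a box (or back onto the tray) is an
// immutable state update: return a new array with just that one
// employee's box assignment changed.
function moveEmployee(
  employees: Employee[],
  id: string,
  targetBox: number | null
): Employee[] {
  return employees.map((e) => (e.id === id ? { ...e, box: targetBox } : e));
}

const roster: Employee[] = [
  { id: "1", name: "Ada", box: null }, // unscored, sits in the tray
  { id: "2", name: "Grace", box: 4 },
];

// Dragging Ada out of the tray and dropping her into box 8
const updated = moveEmployee(roster, "1", 8);
console.log(updated[0].box); // 8
```

An immutable update like this keeps the original state intact, which is what makes hover, selection, and drag previews easy to render optimistically before a drop is committed.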

For this project, we needed strong frontend expertise, so Sebastian (our frontend expert) led the engineering side. The work was complex enough that every engineer on our team contributed along the way.

Members of the Plan squad

Designing & Refining

Building box view

Over the next few weeks, I created 100+ mocks and several prototypes for the v1 of this project. Along the way, I collected feedback from customers, design team members, and internal stakeholders until we felt comfortable that our product would meet our MLP requirements. (MLP = Minimum Loveable Product)

Below are just a few of the designs I created for this feature.

A snapshot of my Figma file for this project

The new box view, sitting alongside the existing table view

If certain employees don't have a score, they appear in the collapsible bottom tray

The context panel appears when you click on an employee in box view

Here's one of the empty states in box view

This didn't make it into our v1, but as a fast-follow, we added the ability to customize box labels

We got feedback that avatars could cause bias, so we added an option to replace avatars with names

Implementation

Putting it all together

The following video shows our v1 of box view on the day it launched. Notice the many interactive details, like dragging across boxes, dropping someone into a new box, hovering over an employee, selecting an employee, and the tray interactions. I love designing interactive details, so it was deeply satisfying to see all this come together.

Video of Box View in action

Impact

Breaking out of the box

We launched the first version of box view at the end of September 2022. In the months that followed, we noticed a huge uptick in customers using our calibration tool.

For example, in September 2022, only 7.7% of performance reviews included calibration, but by January 2023, that number jumped to 16%! That's more than a 100% increase in usage, and we believe box view played a big part in that growth.
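As a quick sanity check on that claim, the relative increase from 7.7% to 16% works out to roughly 108%:

```typescript
// Relative increase between the two usage figures from the text
const before = 7.7; // % of reviews with calibration, Sept 2022
const after = 16;   // % of reviews with calibration, Jan 2023

const increasePct = ((after - before) / before) * 100;
console.log(increasePct.toFixed(0)); // "108"
```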


Calibration usage jumped from 7.7% to 16% after box view launched

Reflections

Lessons learned

As of January 2023, we're continuing to iterate on box view, but these have been my key learnings while working on this project:

Leverage existing frameworks

Although I had never heard of the 9-box framework before joining Lattice, we learned through research that this was a common framework used for calibration. We just needed to build a tool that could support this, and the more we learned about the 9-box, the easier it was to design for this use case.

Balance power with simplicity

One of the hardest challenges on this project was supporting the many variations of the 9-box framework that customers used. For example, some customers used a 4-box, some wanted to rename the boxes, and some wanted to hide the avatars. When interviewing customers, we had to dig into which features were must-haves vs. nice-to-haves.
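One way to think about those variations is as a small configuration surface rather than separate features. The sketch below is purely illustrative (none of these type or field names come from Lattice's codebase); it just shows how a 9-box, a 4-box, custom labels, and hidden avatars can all be expressed by one config shape.

```typescript
// Hypothetical configuration covering the variations customers asked for
interface BoxViewConfig {
  rows: number;            // 3 for a 9-box, 2 for a 4-box
  cols: number;
  labels?: string[];       // optional custom box labels (the fast-follow feature)
  showAvatars: boolean;    // false swaps avatars for names to reduce bias
}

// The default: a standard 9-box with avatars shown
function defaultConfig(): BoxViewConfig {
  return { rows: 3, cols: 3, showAvatars: true };
}

// A 4-box variant with custom labels and avatars hidden
// (the label strings here are made up for illustration)
const fourBox: BoxViewConfig = {
  rows: 2,
  cols: 2,
  labels: ["Develop", "Grow", "Coach", "Promote"],
  showAvatars: false,
};

console.log(defaultConfig().rows * defaultConfig().cols); // 9
```

Framing must-haves vs. nice-to-haves as config fields like these makes the trade-off explicit: each optional field adds flexibility for some customers but another state every screen has to handle.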

Push for polish

For this project, there were a lot of nuanced interactions and edge cases, and we knew we couldn't just test everything at the end. Instead, I tested the product regularly throughout development, escalating issues soon after PRs were merged. This helped us tackle bugs quickly and helped the end product feel polished.