Auditing Overstock.com: tools, people, & patterns

A capture I made while auditing Overstock.com on the 2-person Design Systems team (supporting 20+ UX designers).

The setup

There’s a lot of communication that happens around design systems, but one of the first things that needs to happen is getting an inventory of what patterns currently exist.

Working within a newly formed Design Systems team at Overstock.com, I was part of a duo tasked with supporting 20+ UX designers. The site launched in 1999, so there were many patterns to uncover.

One of my first tasks was to identify the current components, discrepancies, & mobile variation patterns across the user experience. But before touching any of that, I needed to understand the system that created them.

Observe the system first

Before making suggestions, take a few weeks (or months in larger orgs) to really experience the communication patterns throughout the org (tools + how people operate).

(Note: if you need to start auditing design patterns right away, Brad Frost’s interface audit is a great resource for jumping in & putting screenshots into Keynote.)

Whimsical board mapping out the team's current tool subscriptions & their overlapping features.

I used Whimsical to map out what the team was currently paying for & each tool’s features.

Let’s imagine you’re on a team of 15 UX designers, split across several teams along the purchase journey.

After a few weeks, you’ll get an idea of how people do things.

Mind map of all the components in the Sketch library, organized in MindNode.

An overview of the components in the Sketch library, mapped in MindNode. I like using different tools, & this one is a solid native Mac app for mind mapping.

InVision usage data showing one account generating 90+% of the team's projects.

Before making too many waves, I contacted the main users of the existing tools that would be impacted by any contract changes. (Note: it was only one account generating 90+% of the projects.)

Slack channels & comms setup for managing the transition between old & new design tools.

Supporting a new tool means handling comms around the old tool & creating channels that support & encourage usage & adoption of the new one.

Start small, think big

I wanted to show some value right away, and in e-commerce that means the PDP (Product Details Page). (What’s nice is that, in theory, you’ll eventually get to everything.)

Start wherever the most used part of your app/service is, which will likely contain a lot of the main components.

Design tool + browser

Using Figma and Chrome DevTools, I would start the process of interacting with the page & documenting the patterns.

After a while, you start to get into a rhythm. Having the same URL in two tabs, one in desktop view and one in mobile, then capturing component-by-component ended up being a favorite flow of mine.

It ensured that if I had to stop for something, I would have the component captured on desktop and mobile. I err on the side of desktop and mobile being ‘complete’ rather than documenting all patterns found in a certain view as complete (e.g., ‘desktop complete’). We are going component-first instead of viewport-first.
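To make that component-first rule concrete, here’s a tiny sketch (in Python, with made-up component names — not Overstock’s actual inventory) of tracking which captures are ‘complete’ across both viewports:

```python
# Hypothetical tracker for a component-first audit: a capture is "complete"
# only when the component has been documented on BOTH desktop and mobile.

VIEWPORTS = ("desktop", "mobile")

def incomplete(captures: dict[str, set[str]]) -> list[str]:
    """Return the components still missing at least one viewport capture."""
    return sorted(
        name for name, seen in captures.items()
        if not set(VIEWPORTS) <= seen  # subset check: both viewports done?
    )

# Illustrative state partway through an audit session.
captures = {
    "add-to-cart-button": {"desktop", "mobile"},  # done on both
    "nav-menu": {"desktop"},                      # mobile still missing
    "option-selector": set(),                     # not started
}

print(incomplete(captures))  # the components still on the to-do list
```

If you have to stop mid-session, the list of incomplete components tells you exactly where to pick back up.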

A Figma library of macOS cursors for documenting accurate hover-states in audit captures.

Preview of the cursors Figma library that helped show accurate cursor hover-states.

Capture everything, annotate as you go

Close-up of Figma annotations capturing menu patterns & option-selection components from the live site.
A peek at some close-up pattern capture around menus and option-selection.

Screenshots & annotations in Figma were pivotal in our audit. Visual documentation is invaluable — it serves as a clear reference point that transcends subjective interpretations.

Some metadata I’d include:

- The full URL, so anyone can get back to the exact page
- The date of the capture
- The environment (e.g., production vs. staging)
- The viewport (desktop or mobile)
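As a sketch, that per-capture metadata could be kept as a small record alongside each screenshot. The field names (and the example date) here are my own, not a standard:

```python
# Hypothetical metadata record for one audit capture; the fields are based
# on what I found useful (full URL, capture date, environment, viewport).
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class Capture:
    component: str
    url: str           # full URL, so anyone can revisit the exact page
    captured_on: date
    environment: str   # e.g. "production" or "staging"
    viewport: str      # "desktop" or "mobile"

meta = Capture(
    component="nav-menu",
    url="https://www.overstock.com/",  # illustrative
    captured_on=date(2019, 1, 15),     # illustrative date
    environment="production",
    viewport="desktop",
)
print(asdict(meta))  # serializable, so it can live next to the screenshot
```

Keeping this structured (instead of buried in layer names) makes it trivial to answer "when and where was this captured?" months later.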

What I found (& why it matters)

Zoomed-in comparison of two nav menu hover interactions with completely different styling patterns despite being pixels apart.

Zoomed in around the menu hover interactions. Even though they were a few pixels away on the nav, they were miles apart when it came to styling patterns.

It’s possible to quantify aspects of a ‘poor experience’, defined in part by inconsistencies in the experience. It might be interesting to count the number of buttons that have a unique shade of blue, but it’s more interesting to understand why those inconsistencies happened in the first place.

There’s an infinite number of possible scenarios that might cause a bunch of unique buttons to show up on a product. The chances of guessing the right cause are slim, so it’s best to involve people who have insights into areas you don’t.
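As a toy illustration of that kind of count, here’s a sketch (the hex values are invented, not Overstock’s palette) that tallies the distinct button colors found across a set of captures:

```python
# Toy version of the "how many unique shades of blue?" count: normalize
# case and tally the distinct color values captured from button styles.
from collections import Counter

def unique_shades(colors: list[str]) -> Counter:
    """Count occurrences of each exact (case-normalized) color value."""
    return Counter(c.lower() for c in colors)

# Illustrative computed colors pulled from button captures.
button_colors = ["#1F5DA0", "#1f5da0", "#2A6EBB", "#1C4F8B", "#2a6ebb"]

shades = unique_shades(button_colors)
print(len(shades))            # number of distinct blues found
print(shades.most_common(1))  # the most widely used shade
```

The number itself is a conversation starter; the real work is tracing each outlier shade back to the decision that produced it.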

Alignment instead of blame

The Overstock design system page alongside a Paper doc with notes on the team's values.

The Overstock design system page & the Paper doc I had with notes around the team’s values.

Capturing the product experience in a visual way can help build alignment & allow for discussions around patterns & current implementation. Remember to include some helpful metadata that lets you get back to whatever you were testing quickly (e.g., full URL, test date, environment, etc.).

The Sketch library file & its preview image in the library panel.

A peek at the Sketch library & the library file preview. Take a scroll down memory lane with this how-to article.

When it comes to ‘issues’ or inconsistencies in the experience, they’re usually a manifestation of a human-relations dysfunction somewhere along the creation line. Being mindful that there’s a human-relations factor underneath the UI conversations can help bring a gentler tone.

Conversations can get muddy when people start blurring design components vs. code components vs. words-only components, which can create miscommunication. Stepping back & making sure everyone is aligned on what type of component we’re talking about can save some confusion.

Just remember — design components are an abstraction. It’s what’s live on production that matters.

What this led to

The audit work fed directly into a few things: consolidating the team’s design tools from 3 down to 1, building out an MVP design system in about 6 months (with limited dev resources), & creating reusable component guidelines that the 20+ designer team could actually use day-to-day.

But maybe more importantly, the process of going through the product built a shared understanding across the team about where patterns diverged & why. That made the conversations about fixing things a lot less contentious.

A dirty job (& why it can’t be automated)

There seems to be a desire to automate things, especially if it’s a process that naturally takes time to complete. For example, clicking through a site/app.

It takes time for a human to click through & experience whatever is being presented on their screen. I’m not sure of a better way to design for an experience than to go through it yourself.

Even if there are tools that can automate audits & capture components to document, there’s still the human experience of interacting with the product.

There’s a difference between reading about a 900ms blocking transition that’s happening on a key component versus having to sit through the experience.

When you go through the product & start capturing patterns, you begin to develop a broader ‘sense’ of the system. Hearing about changes, you might be able to predict how the system might respond & ask questions accordingly.

It’s tedious, time-consuming, & 100% worth it.
