Joshua Porter (@bokardo)
Doug Bowman is the lead designer at Twitter. A few years ago Doug wrote a blog post about leaving Google, where he had been hired three years prior. He was really the first real designer hired at Google, and very few hires get that much press and discussion. Posts announcing a departure usually follow the same formula: this is my last day, I’m sorry to be leaving, and my co-workers are great. Doug’s post was nothing like that at all: “Unfortunately for me, there was one small problem I didn’t see back then.” And that’s the problem we’ll talk about in this session: testing 41 shades of blue to see which one performed better (the best they found was HEX #2200C1), and an argument over the width of a border (1, 2, or 3px). Bing was/is using a different HEX color, which their UX Manager estimated as a loss of $80 million. The data-testing culture becomes a crutch for every decision, paralyzing the company and preventing it from making any daring design decision.
Spectrum of Design
- Follow other people’s practices
- Trust your gut
- Rely on data for decision making: everything is tested with small numbers and small variables, you don’t trust your gut, and the process is very slow
Imagine that Your Design is a Mountain
Your existing design is on the side of a little mountain. With an engineer’s approach you can only get to the top of that little mountain: the engineer quickly hits diminishing returns, because they can only go so high with the current design approach. It’s really disappointing, because you’ll eventually hit a ceiling where you cannot go any higher, known in calculus as a local maximum.
Our goal is to be at the top of the largest mountain: the best (or even just a better) design.
- Optimization asks: What works best in the current model?
- Design innovation asks: What is the best possible model?
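The mountain metaphor maps directly onto hill climbing: an optimizer that only accepts small improvements can reach no further than the peak of whatever hill it starts on. A minimal sketch, where the landscape function, starting points, and step size are all invented for illustration:

```python
import math

def landscape(x):
    # Two hills: a small one near x=1, a tall one near x=4 (illustrative only).
    return 2.0 * math.exp(-(x - 1) ** 2) + 5.0 * math.exp(-(x - 4) ** 2)

def hill_climb(x, step=0.1, iterations=200):
    # Optimization: only accept small changes that improve the current design.
    for _ in range(iterations):
        if landscape(x + step) > landscape(x):
            x += step
        elif landscape(x - step) > landscape(x):
            x -= step
        else:
            break  # local maximum: no small change helps anymore
    return x

print(round(hill_climb(0.0), 1))  # → 1.0, stuck on top of the small hill
print(round(hill_climb(3.0), 1))  # → 4.0, a different model reaches the big one
```

Starting from a different point (a new model, not a tweak) is the only way this optimizer ever reaches the taller mountain, which is the design-innovation question.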
What are Metrics?
Metrics are simple numbers that measure the effectiveness of your business.
- Metrics reduce (but don’t take away all) arguments based on opinion.
- Metrics give you answers about what really works. They can also lead you down a rabbit hole, but if you do testing and you have valid data they can give you answers about what really works.
- Metrics show you where you’re strong as a designer. They also show you where you’re weak as one.
- Metrics allow you to test anything you want. They actually empower you to try anything, whereas before you’d have to sell someone on a crazy idea.
- Clients love metrics.
Principle: Your metrics will be as unique as your business.
Vanity Metrics are things such as old-school graphical hit counters, which you shouldn’t rely on.
The Usage Lifecycle
- Trial/beta User
- Passionate Customer
- How much did it cost to gain your customer? (Cost per Acquisition, CPA)
- If your CPA is higher than the customer’s lifetime earnings (lifetime value), then it’s not worth it.
- Comparative Metrics: knowing where users came from, and the cost and outcome for each of those sources.
- The best acquisition outcome is still from Email Lists.
- Page views
- Unique Visitors
- Returning Visitors
- Registered Users
- Time on Site
- Daily Active Users
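The CPA rule a few bullets above fits in a few lines of code: divide spend per channel by customers acquired, and keep only channels whose per-customer lifetime value beats that cost. The channel names and all figures below are invented for illustration:

```python
# Compare acquisition cost (CPA) against lifetime value per channel.
# All figures are invented for illustration.
channels = {
    # name: (total spend, customers acquired, lifetime value per customer)
    "email_list":  (500.0,  100, 40.0),
    "paid_search": (2000.0, 100, 15.0),
}

for name, (spend, customers, ltv) in channels.items():
    cpa = spend / customers
    verdict = "worth it" if ltv > cpa else "not worth it"
    print(f"{name}: CPA ${cpa:.2f} vs LTV ${ltv:.2f} -> {verdict}")
```

Run per acquisition source, this is exactly the comparative-metrics view: same calculation, different cost and outcome for each channel.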
Cohort Analysis: Engagement over time. For instance, the number of customers remaining after every month from sign-up. This is very valuable.
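A cohort table like the one described can be built by grouping users on their sign-up month and counting how many were still active N months later. A minimal sketch with invented data:

```python
from collections import defaultdict

# Each user: sign-up month, plus the set of months (after sign-up) they
# were active in. All data is invented for illustration.
users = [
    ("2012-01", {0, 1, 2}),
    ("2012-01", {0, 1}),
    ("2012-01", {0}),
    ("2012-02", {0, 1}),
    ("2012-02", {0}),
]

retention = defaultdict(lambda: defaultdict(int))
sizes = defaultdict(int)
for cohort, active_months in users:
    sizes[cohort] += 1
    for m in active_months:
        retention[cohort][m] += 1

for cohort in sorted(retention):
    # Fraction of the cohort still active 0, 1, and 2 months after sign-up.
    row = [retention[cohort][m] / sizes[cohort] for m in range(3)]
    print(cohort, [f"{r:.0%}" for r in row])
```

Reading across a row shows how engagement decays for one sign-up cohort; reading down a column shows whether newer cohorts retain better than older ones.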
Net Promoter Score: How likely is it that you would recommend our company to a friend or colleague? (Likert scale 1–10: 1–6 are “Detractors”, 7–8 are “Passives”, and 9–10 are “Promoters”.)
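The score itself is the percentage of promoters minus the percentage of detractors, using the thresholds from the notes above. A small sketch with an invented set of survey responses:

```python
def nps(scores):
    """Net Promoter Score: % promoters minus % detractors.
    Thresholds per the notes: <=6 detractor, 7-8 passive, >=9 promoter."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Invented survey responses: 3 promoters, 2 passives, 2 detractors.
print(round(nps([10, 9, 9, 8, 7, 6, 3]), 1))  # → 14.3
```

Passives count toward the total but not toward either side, so the score ranges from -100 (all detractors) to +100 (all promoters).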
Mint.com: “Maybe we didn’t have a high viral coefficient score, but we had great satisfaction metrics.”
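The viral coefficient in that quote is commonly computed as invites sent per user times the invite conversion rate; a value above 1 means each user brings in more than one new user. A sketch with invented numbers (the function name and figures are illustrative, not from the talk):

```python
def viral_coefficient(invites_per_user, conversion_rate):
    # k > 1: self-sustaining viral growth.
    # k < 1: growth must come from other channels (like satisfaction/retention).
    return invites_per_user * conversion_rate

k = viral_coefficient(invites_per_user=3.0, conversion_rate=0.2)
print(round(k, 2))  # → 0.6, below 1, so growth leans on satisfaction instead
```

This is the trade-off Mint is describing: a sub-1 viral coefficient can still work if satisfaction metrics keep existing users around.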
Having friends inspires continuous use.
“Find the people you know” is a good example of this.