Putting Data in Design
BY RYAN UNDERWOOD
Want to develop a product that customers will love? Take a cue from Internet companies and do some testing. A lot of testing.
Designing an innovative product requires impeccable taste, sharp instincts--and, of course, good data.
Taking a page from the tech industry, product makers and retailers are employing A/B testing, a technique frequently used to refine websites. Traditionally, that involves showing users two slightly different versions of a webpage to see which one drives more sales. For example, Version A might have a button that says "Buy Now," while Version B's says "Get More Information." For online developers and marketers, these sorts of experiments have become the industry standard. A survey from MarketingSherpa finds that, of the online marketers who measured return on their A/B testing, 81 percent reported a positive return on investment.
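To make the mechanics concrete, here is a minimal sketch of that kind of test in Python. The button labels echo the example above, but the traffic split, visitor counts, and the use of a standard two-proportion z-test are illustrative assumptions, not any particular company's setup.

```python
# A minimal A/B test sketch: bucket visitors into two button variants,
# then compare conversion rates with a two-proportion z-test.
# All numbers below are hypothetical.
import hashlib
import math

VARIANTS = {"A": "Buy Now", "B": "Get More Information"}

def assign_variant(user_id: str) -> str:
    """Give each user a stable 50/50 assignment to variant A or B."""
    digest = int(hashlib.md5(user_id.encode()).hexdigest(), 16)
    return "A" if digest % 2 == 0 else "B"

def two_proportion_z_test(conversions_a: int, n_a: int,
                          conversions_b: int, n_b: int):
    """Return the z statistic and two-sided p-value for the difference
    in conversion rates between the two variants."""
    p_a, p_b = conversions_a / n_a, conversions_b / n_b
    p_pool = (conversions_a + conversions_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

if __name__ == "__main__":
    # Hypothetical results: 130 of 2,000 visitors bought with variant A,
    # 90 of 2,000 with variant B. A p-value below 0.05 is the usual bar
    # for treating the difference as real rather than noise.
    z, p = two_proportion_z_test(130, 2000, 90, 2000)
    print(f"z = {z:.2f}, p = {p:.3f}")
```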
Now, companies that make physical stuff are using these tests to determine what customers want and how best to get them to buy. Crowdery, a Y Combinator-backed start-up based in San Francisco, is working on a widget that would let retailers collect data on which potential products customers prefer. Crowdery's technology is still in beta testing, but the process can be as explicit as asking consumers to vote for a favorite shirt style in hopes of scoring a presale discount if the item ultimately gets made. Or Crowdery's code can lurk silently in the background, walking a customer through a typical transaction before revealing that the item is not yet available.
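That silent mode is essentially what web developers call a fake-door test: let the shopper go through a normal-looking checkout, record the purchase intent, and only then explain that the product doesn't exist yet. The sketch below illustrates the idea; the function names and in-memory log are assumptions for illustration, not Crowdery's actual code.

```python
# A simplified "fake door" test: capture real purchase intent for an
# unreleased product, then break the news to the shopper.
# Hypothetical names and storage -- not Crowdery's API.
from datetime import datetime, timezone

INTENT_LOG = []  # stand-in for a database or analytics event stream

def attempt_checkout(user_id: str, product_id: str, price: float) -> str:
    """Record that a shopper tried to buy an unreleased product."""
    INTENT_LOG.append({
        "user": user_id,
        "product": product_id,
        "price": price,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    # Only after the intent is captured does the shopper learn the truth.
    return ("Thanks for your interest! This item isn't available yet. "
            "We'll email you if it goes into production.")

def demand_signal(product_id: str) -> int:
    """Count how many shoppers got all the way to the 'buy' click."""
    return sum(1 for event in INTENT_LOG if event["product"] == product_id)
```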
Founder Maran Nelson came up with the idea after working with a company that makes and sells backpacks. At the time, the founder of that business was worried about investing time and money into manufacturing designs that might prove to be unpopular with customers. “We started seeing that there was this pain point for retailers,” says Nelson. “Ultimately, you have an industry making huge financial decisions in a very inelegant way.”
Unlike traditional focus group participants, customers in these sorts of A/B tests often believe they are about to purchase a product, which makes the feedback more valuable. For instance, Julep, a Seattle-based cosmetics start-up, tested demand for a new nail-polish wand by taking out several ads on Google. One ad presented the new IDEO-designed wand as a tool for sophisticated color mixing. The other promised results similar to those at a professional nail salon. Overwhelmingly, people clicked on the ad touting the professional-salon quality, says founder and CEO Jane Park. She expects to start shipping the gadget in May. Because of the results, she’s now considering offering the wand’s color-mixing attachment as a separate product.
In addition to ads, Julep regularly taps customers for input, including polling its Idea Lab, a group of 5,000 customers who weigh in on early prototypes. These sorts of tests help speed up the development cycle and validate demand for an item before it hits the market, says Park. Even small tweaks made with feedback from customers--whether it’s a slightly different nail-polish formula or an improved package design--can make a big difference in sales. So that input is invaluable, says Park, even if it occasionally proves her wrong. “I have a disagreement with my creative director almost every day,” she says cheerfully. “But there’s a simple way to settle any argument: We take it to the people.”
Here are three rules for making the most of this kind of customer feedback:
1. Ask the right question. Don’t waste your time testing small tweaks. The choices you’re asking customers to make in an A/B test should be different enough for your audience to notice. The bigger the difference, says Robert Moore, a statistician and the CEO of RJMetrics, the fewer people you need to poll to produce statistically meaningful results (the sketch after this list puts rough numbers on that trade-off).
2. Simulate real life. There’s a big difference between paying people to participate in a focus group and having them actually think they’re about to spend money on something. You’ll get a more accurate reflection of customer demand when people believe they are being asked to open their wallets.
3. Don’t become a slave to the numbers. In the same way that politicians shouldn’t govern by opinion polls alone, leaders should avoid making decisions on data alone, says Jane Park of Julep. Just because something does well in an A/B test does not guarantee it will be a hit in the marketplace.
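To see rule 1 in numbers, the sketch below uses the textbook sample-size approximation for comparing two conversion rates. The baseline rates, confidence level, and power are hypothetical choices for illustration, not figures from Moore or RJMetrics.

```python
# Rough sample-size estimate for a two-proportion A/B test: the bigger the
# expected jump in conversion rate, the fewer visitors each variant needs.
from statistics import NormalDist
import math

def sample_size_per_variant(p1: float, p2: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate visitors needed per group to detect a change in
    conversion rate from p1 to p2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 95% confidence
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

if __name__ == "__main__":
    # Detecting a big jump (5% -> 10%) needs only a few hundred visitors
    # per variant; detecting a small one (5% -> 6%) needs thousands.
    print(sample_size_per_variant(0.05, 0.10))
    print(sample_size_per_variant(0.05, 0.06))
```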
Posted in: product design, A/B testing