Presentation given by Jason Cooper, Director of Value Added Services for Efficient Frontier, at PubCon in Las Vegas on November 11, 2009. This was presented as part of the Multivariate Testing and Conversion Tweaking panel.
I consider myself a designer first and foremost. I’ve been designing websites for a little over ten years now, and over that time I’ve seen a noticeable shift in how website and landing page design is done. We’ve moved from incorporating technology that answered broader usability questions like “how do I provide access to 30 different category pages in my navigation?” to focusing on how to meld these technologies with design and data to maximize customer throughput.

We’re moving into an era of performance-based design, and one of the primary principles of this movement is the understanding that we can no longer assume that what we think as designers is best for users. And for the managers in the room: you can no longer sign off on designs just because they look good. The web is the most measurable and testable form of design in human history, so it is inevitable that we’re moving toward this data-driven design. This slide is more relevant this year than it was last year, and it will probably be more relevant next year than it is this year.

So, alright, what exactly is this slide? What’s the point? This was Yahoo in 1996. To help illustrate my point that the web is evolving into a data-driven design model, I thought it would be interesting to see what’s changed about Yahoo over the years, because talk about a company that has a lot of data on user behavior and the potential to benefit tremendously from it. In ’97 they went to two columns (hardly data-driven, but it did allow them to fit more above the fold). No real major changes again until ’99, when they added a third column for news and other items. In 2000, they shoved even more links and content onto the page with the finance box and more buttons across the top. So, what were they really doing during this time? They were just shoving as much content into a limited amount of space as they could. In 2002, we see the first hint of a decision made from data, and it’s not these pretty tabs on the side.
Anyone want to take a guess at what it is? The directory links are no longer arranged alphabetically. Someone finally decided to order them by usefulness. From there we continue to see things shift around until 2006, when the Internet Archive’s crawler wasn’t able to support whatever new technologies Yahoo implemented, and then we end up with what we see today, where you see things like Popular Searches, which is an up-to-the-minute, data-driven usability feature.
So, how did we get here? What are the factors that are quickly driving this shift into performance-based design? The biggest ones that I see are that technology on the web is maturing, which has given designers and developers a chance to breathe and master all of the innovations like AJAX and video and Flash and mash-ups. We’re no longer spending all of our time trying to learn the latest technology; instead, we’re able to focus on how to best implement that technology from a user perspective. Broadband penetration, and even to some extent 3G networks for mobile broadband access, has untied our hands and allowed us to use all these technologies freely. The down economy has made getting more for less a mantra of every organization, and frankly it is a lot easier and cheaper to double your conversions than it is to double your traffic. And then probably the biggest contributing factor is the availability of free technologies like Google Analytics and Website Optimizer, which have made testing and analytics accessible to even the smallest business owner.
So what does it mean? It means a growing awareness of and interest in testing… perhaps that’s what brought some of you here today. It means that greater ROI is going to be sought through post-click optimization rather than through acquisition channels. And it’s going to mean a hunger for more data through new and innovative methods of collection and segmenting. And what better indication of this movement to data-driven design than the marriage of Adobe and Omniture.
So what specifically should you test? The simple answer to that question is everything, because remember, one of the core principles of performance-based design is that we can’t assume that what we think as designers is correct. By testing everything, you always have an answer to the “why” question. Why that headline? Why that blue? The challenge is really knowing where to start, because if you were to literally test everything, especially all at once, you’d have a giant mess on your hands.

I divide this question of what to test into two camps: broader design decisions and the “tweaking.” The broader design decisions are the types of questions and concepts that are more related to how we think when we’re designers. These are big-picture concepts like first impressions and page layouts. In testing, these are the types of problems that you might address with a radical redesign of a page vs. simply tweaking elements of the page here and there. I find these to be more appropriate for A/B testing, and what you’re really looking for here is “what’s sticky? What keeps the users on my page?”

On the tweaking side, you’re dealing with the types of questions and concepts that are more closely related to how we think when we’re marketers. So now that we’ve captured the visitor’s attention, what do we say to them? These are things like specific messaging in individual elements like headlines and buttons, where you identify one page element that might have 10 different viable alternatives: 10 different headlines or hero shots or incentives. These are the types of things that are better for multivariate testing, and here you’re looking for what resonates with the visitor. An important thing to remember is that to make really big improvements in conversions, sometimes you need to make really big leaps in page design… tweaking isn’t always the right place to start.
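The “giant mess” of literally testing everything at once can be made concrete: a full-factorial multivariate test needs one page variant for every combination of element alternatives, so the variant count multiplies quickly. A minimal sketch in Python (the element names and alternative counts are invented for illustration):

```python
from itertools import product

# Hypothetical page elements, each with several candidate alternatives.
headlines = [f"headline_{i}" for i in range(10)]
hero_shots = [f"hero_{i}" for i in range(5)]
buttons = [f"button_{i}" for i in range(4)]

# A full-factorial multivariate test needs one variant per combination.
variants = list(product(headlines, hero_shots, buttons))
print(len(variants))  # 10 * 5 * 4 = 200 page variants to serve traffic to
```

Each of those 200 variants needs enough traffic to reach a meaningful result, which is one reason to start with A/B tests of big layout changes before multivariate tweaking of individual elements.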
Since this session is about multivariate testing and tweaking, I wanted to point out the basics of what you should be testing in a multivariate situation. These are sort of the bare minimum that you should be testing, and they are generally part of any good landing page. If you take nothing else away from this, at least understand that these items will likely have the biggest bearing on your conversion rates.

You have your brand presentation. People need to trust your brand, and you have to figure out the best way to make that happen. I call it brand presentation because maybe it’s as simple as a small logo, but maybe it requires a bold, pronounced logo with a brief intro paragraph that conveys the value proposition. As a general rule, the bigger the brand name, meaning the more well known it is, the smaller the brand presentation needs to be on the page.

If you’re selling a product, how you picture that product on the page can have a big impact on the visitors’ perception of the product. A perfect example of this is downloadable software. Pages that actually show a box or a CD to represent the software usually perform better, because they give the visitor a sense that they are getting something, and that adds value in their mind. It’s a similar circumstance with the hero shot. It’s one of the quickest ways to connect with your visitor. Obviously, they want to see themselves.

Buttons. Buttons have major influence over conversions. If you have a button, that’s probably your conversion point. When a visitor clicks that button, they want something to happen; they want to get one step closer to fulfilling their desire. So utilize that button to draw attention, and use the text in the button to tell the user what to expect when they click it. You should never use a standard browser button; it should always be an image button, and it better never say “submit” on it. Every landing page needs a call to action: buy this, call us, fill out this form.
Some work better than others, so test which works best for you. Incentives are how you get people to take action: a sale, free shipping, a limited-time offer. Figure out which works best, although I can tell you it’s probably free shipping if you’re selling a product that needs shipping, but that might not be enough. Test the process. The process is the transparency or instructional quality of getting from landing on the page to converting: the ease of use. This is especially important in shopping carts. And then there’s the headline. Headlines are vitally important. It’s your one second to convey your message.
Here’s an example of a headline test by 37signals, which they wrote about on their blog, where simple headline testing increased their conversions by 30%. So the difference between “Start a Highrise account,” with the associated subheading, and “30-day free trial on all accounts” was 30%. Imagine if you were bringing in 10,000 visitors a day, what a 30% increase would mean to your business.
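A lift percentage on its own can’t tell you whether the result is real or just noise; for that you need the raw visitor and conversion counts. One standard way to check is a two-proportion z-test. A sketch using only the Python standard library (the counts below are invented for illustration; they merely produce a 30% lift on 10,000 visitors per variant):

```python
import math

def conversion_lift_significance(conv_a, visitors_a, conv_b, visitors_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    # Pooled proportion under the null hypothesis (no real difference).
    pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF built from erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    lift = (p_b - p_a) / p_a
    return lift, z, p_value

# Illustrative numbers only: 3.0% vs 3.9% conversion, a 30% relative lift.
lift, z, p = conversion_lift_significance(300, 10_000, 390, 10_000)
print(f"lift={lift:.0%}  z={z:.2f}  p={p:.4f}")
```

With this much traffic the lift is highly significant (p well under 0.01); with only a few hundred visitors per variant, the same 30% lift could easily be chance, which is why sample size matters as much as the headline.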
Now we’re going to back up a little bit, because even though we’ve gone over the basics of what should be tested and had an overview of the technology that allows you to test, a HUGE question still remains: “How do you develop the variables for your test?” In other words, you know you should test your headline, but what are the different messages you should test in that headline? What are the different demographics you should represent in your hero shot? Here again, we get back to the principle of “we can no longer assume that what we think is right is best for users.” So here is where a diverse set of data inputs comes into play. You should consider most of this as the groundwork you need to do before you start testing, because all of these elements should be incorporated into your landing page design and test design.

There are two types of data to consider. Quantitative data is the stuff the numbers tell us. Your site analytics can tell you where visitors to your site enter from, the pages they visit while they’re there, and the pages they exit from. It can probably also tell you important things like what country they were in when they visited your site and what language setting their computer is set to. It can also tell you what keywords they were looking for when they found your site. Paid search can tell you which messages resonated best with visitors through ad copy testing. And ongoing test results will constantly refine your design and messaging.

The other type of data is qualitative, and I think this is often overlooked, when actually it can be the most important kind of data to take into consideration. Qualitative data tells you about your customers in a way that has emotion and attitude and some raw honesty. Things like customer feedback and what your customers (or detractors) are saying about you in social media can give you amazing insight into what people like and dislike about your brand vs. other brands.
If you can identify people’s misgivings, you can directly address them in your messaging. And then there are personas, which I believe Vanessa is going to cover.
There’s a caveat to looking at all of this data: beware of aggregate, or average, data. It is very easy to fall into the mode of looking at the big average numbers at the top of a report in Google Analytics and saying, “Hey, our bounce rate went down since we started doing our test. That’s great, we succeeded.” Keep in mind that plans based on average assumptions are wrong on average. Here’s an example where the average bounce rate is about 54%, but does that mean you should ignore the fact that your display campaign is bouncing at nearly 83%? No. Segmentation of your data is important to truly understand how the test elements influence individual users.
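A quick way to see why the blended number misleads is to compute the overall rate and the per-channel rates side by side. The channel numbers below are invented, but chosen to reproduce the slide’s situation: roughly 54% overall while display bounces at 83%:

```python
# Hypothetical per-channel traffic: the blended bounce rate looks
# acceptable while one channel is quietly failing.
channels = {
    "organic": {"visits": 6000, "bounces": 2900},  # ~48%
    "paid":    {"visits": 3000, "bounces": 1600},  # ~53%
    "display": {"visits": 1200, "bounces": 996},   # 83%
}

total_visits = sum(c["visits"] for c in channels.values())
total_bounces = sum(c["bounces"] for c in channels.values())
overall = total_bounces / total_visits
print(f"overall bounce rate: {overall:.1%}")  # ~54% -- looks fine in aggregate

for name, c in channels.items():
    print(f"{name}: {c['bounces'] / c['visits']:.1%}")
```

The segmented view surfaces the 83% display problem that the 54% average completely hides.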
How should you segment your data? This really depends on your business goals and the tools you have available to you. For your business, you have to review your data to find out what matters. I tend to look at data in these buckets. Demographic data is the persona-related stuff. Does gender or age or a visitor’s language affect their conversion? Am I even catering to the right demographics? I always break out traffic by source, because different channels perform very differently, so I don’t want to lump what social media visitors are doing on the site in with what paid search visitors are doing. You have to understand where there is value and where there isn’t. Do visitors from social media really convert enough to make a decision that would favor them but adversely affect your paid search traffic? And this is a hard thing: getting to understand which types of visitors are valuable to you and which aren’t. And then lastly, I look at psychographic data if it’s available. So, what point in the decision-making cycle are they in? Because I want much stronger messaging and incentives for someone who’s motivated to buy a product, versus the softer, informational approach I’d prefer to take with someone still researching their options. This is really the realm of retargeting, which is a whole other panel.
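As a sketch of the break-out-by-source idea, here is how a raw visit log might be rolled up into per-channel conversion rates so that each channel can be judged on its own (the log records and source names are hypothetical):

```python
from collections import defaultdict

# Hypothetical visit log: (traffic_source, converted?)
visits = [
    ("paid_search", True), ("paid_search", False), ("paid_search", True),
    ("social", False), ("social", False), ("social", True),
    ("organic", True), ("organic", False),
]

# Tally visits and conversions per traffic source.
counts = defaultdict(lambda: {"visits": 0, "conversions": 0})
for source, converted in visits:
    counts[source]["visits"] += 1
    counts[source]["conversions"] += int(converted)

for source, c in sorted(counts.items()):
    rate = c["conversions"] / c["visits"]
    print(f"{source}: {rate:.0%} ({c['conversions']}/{c['visits']})")
```

The same roll-up generalizes to any segment key: swap the source for country, language, or persona and the loop is unchanged.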
Another thing you should do with your data is combine it to get more meaning, especially by layering qualitative data in with quantitative data. Typically, quantitative data can’t tell you “why” something is happening. Analytics can’t tell you why so many people are bouncing from a particular landing page, but if you got the opportunity to get customer feedback, you would know. You can do things like take your personas and use them to create meaningful segments in your analytics, so that you can see how what Jim is doing on the site differs from what Jill is doing. You can look at your test results, see that certain incentives or headlines are underperforming, and turn to customer feedback and social media to see what people are saying that could give you insight.
The goal with all of the segmentation and data combination is finding relationships. So, slice and dice that data any way you can. Look for cause-and-effect correlation: something that lets you consistently say that because of A, B happened. Don’t forget to look at data on your competitors; sometimes that can provide the missing piece of the puzzle. Maybe they are simply offering a bang-up incentive to buy their product, and if you can’t compete with that, then testing just isn’t going to give you big returns or tell you what those granular synergies are, because one core component of your offer is affecting the entire process. Lastly, utilize third-party tools to add more color to your data. This is where I think the next big wave in this movement is going to come from. Companies will start developing tools that allow them to overlay feedback and social media with their analytics, and there will be big money in doing this.
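One simple way to quantify a candidate relationship before reading cause and effect into it is a correlation coefficient. A minimal Pearson-correlation sketch with invented weekly numbers (a high r only flags a relationship worth investigating; it doesn’t prove that one thing caused the other):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical weekly data: discount depth offered vs. conversion rate.
discount_pct = [0, 5, 10, 15, 20]
conv_rate = [0.020, 0.024, 0.027, 0.033, 0.036]
print(f"r = {pearson(discount_pct, conv_rate):.3f}")  # close to 1: strong positive
```

An r near 1 here says deeper discounts moved with higher conversions; whether the discount actually drove the conversions is exactly what a controlled test is for.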
Tools like Twitter Spectrum, which you can Google, allow you to put in two keywords and see the common keywords used with each. So I put in two competing online universities, and you can see from the word usage that Capella tends to have more positive words associated with it. And I see something that stands out here: the word “biotech.” If I were Kaplan, I’d wonder if Capella was offering a biotech program that maybe we should start developing to compete with, but I need to qualify that data.
So on Facebook Lexicon, I put in “capella” and “biotech” to see if there is a correlation there with the chatter, and sure enough there is, so now I’m really interested in this if I’m Kaplan. These are the fun kinds of activities you can do to dissect data and find insights that you wouldn’t get just by looking at your analytics or test results.
One of the best things you can do for your business is build a culture of testing. A perfect example is Google’s “41 shades of blue” fiasco that some of you may have heard about. Basically, there were two different camps at Google that couldn’t agree on a shade of blue, and so a vice president had to step in and decide the matter, which later led to them testing 41 gradations of blue to find the right one. The problem with this isn’t the testing of 41 variables; it’s the fact that people couldn’t agree on something and had to rely on a VP to solve their problem. A culture of testing would fix that problem. Involve all of your business units… it’s valuable qualitative data to have Bill from shipping’s input, and someone from marketing probably has a good idea that someone from creative can turn into a brilliant design. At Efficient Frontier, most of our clients set aside a portion of their budget for testing everything from new keywords to ad copy and landing pages. This is indicative of a culture of testing. Remember that data can breed creativity. Being data-driven actually allows you to take risks. It’s a safety net that allows you to make those big leaps. If you have the data to back the reason behind your decision, then you’re safe, even if the test fails.
Performance-Based Design: We can no longer assume that what we think as designers is best for users. (Yahoo homepage timeline: Nov. 1996, Oct. 1997, Nov. 1999, Oct. 2000, Nov. 2002, Nov. 2003, Nov. 2004, Nov. 2006, Today)
<ul><li>Technology on the web is
maturing </li></ul><ul><li>Broadband penetration </li></ul><ul><li>Down economy has made getting “more for less” a top goal </li></ul><ul><li>Availability of free technology </li></ul>How Did We Get Here?
<ul><li>To find the granular synergies that persuade visitors, you need complex statistical analysis </li></ul><ul><li>Available tools remove the complexity so you can focus on what to test, not how to test </li></ul><ul><li>Variety of options and costs </li></ul>The Technology
<ul><li>Simple Answer = EVERYTHING </li></ul><ul><li>Challenge
is knowing where to start </li></ul>What Should You Test? <ul><li>First impressions </li></ul><ul><li>User-experience </li></ul><ul><li>General layouts </li></ul><ul><li>Radical redesigns </li></ul><ul><li>Better for A/B </li></ul>When we think like designers <ul><li>Specific messaging </li></ul><ul><li>User-experience </li></ul><ul><li>Individual elements </li></ul><ul><li>Variable alternatives </li></ul><ul><li>Better for multivariate </li></ul>When we think like marketers What’s sticky? What resonates?
<ul><li>It depends on your business
and the tools you have available </li></ul>How Should You Segment Your Data? Age Gender Language Country Connection Speed Paid Search Natural Search Social Media Affiliates Keywords Research Mode Consideration Motivated Repeat Visitor Loyal Customer
<ul><li>Qualitative data can provide flavor,
texture, and nuance to quantitative data </li></ul><ul><li>Typically quantitative data can’t tell you “why” </li></ul>Combining Data Types + = Meaningful Segments + = Understand The “Why”
<ul><li>Slice, dice, and compare data
any way you can to find relevant relationships </li></ul><ul><li>Look for cause and effect </li></ul><ul><li>Data on your competitors can provide the missing piece of the puzzle </li></ul><ul><li>Utilize 3rd-party tools to add more “color” to your data </li></ul>Find the Relationships
<ul><li>Google’s “41 shades of blue”
</li></ul><ul><li>Involve all of your business units </li></ul><ul><li>Set aside a portion of your budget for testing </li></ul><ul><li>Data can breed creativity—allows you to take risks </li></ul>Build a Culture of Testing
<ul><li>Doubling your conversion rate is
far easier (and cheaper) than doubling your traffic </li></ul><ul><li>Get to know your data (all of it) intimately so you can uncover relationships </li></ul><ul><li>Segment and combine your data for more meaningful interpretation </li></ul><ul><li>Test everything until you know what matters </li></ul><ul><li>Develop a culture of testing </li></ul><ul><li>Evolve through continuous testing and improvement cycles—become data-driven </li></ul>Takeaways