Landing Pages & Multivariate Testing: 2009 Updates
Good morning and welcome to Search… er, wait, no. This isn’t the SearchWiki panel. Hold on tight, everyone, because we’ve taken a sudden course change for Landing Pages & Multivariate Testing with panelists Jeremy Crane (TNS Compete), Dan Darnell (Optimost, Interwoven, Inc.), Sandra Niehaus (Closed Loop Marketing), and moderator Gord Hotchkiss (Enquiro).
If you’re interested in the originally scheduled panel, Google’s SearchWiki, Customized & Personalized Results, head over to Outspoken Media, where Lisa will be blogging up a storm. It’s going to be a very well covered session, so I decided to give some love to the PPC track.
Gord says that bad landing pages are where good leads go to die. It’s important to make sure that you understand your visitors’ intent so that they can land on the page that’s targeted to them.
Jeremy Crane is up first and he’s going to teach us how to find that elusive low-hanging fruit. He’s going to be using examples from display campaigns but the lessons should be similar enough for search.
Example: AT&T
He starts with two AT&T campaigns, very similar in look and feel. The Family Plan campaign got more than twice as many clicks as the Free Phone campaign, yet the Free Phone campaign produced more conversions. What’s the deal? The Family Plan landing page just sent visitors to a generic page within the site, while the Free Phone page was specific: it gave three options and a “post-click” experience targeted to conversion.
The traditional flow is: Stimulus -> Exposure -> Response.
However, the reality is that there are multiple stimulus, exposure, and response steps, and online you can measure each of them: first the advertising, then the landing page, then the conversion funnel. Most people skip optimizing the landing page.
Improving performance from the bottom quartile to average or best-in-class translates into a 2-6x improvement in conversion. Assuming no added spend, increases in conversion are direct increases in ROI.
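To make that concrete, here’s a quick back-of-the-envelope sketch in Python. The numbers are mine and purely illustrative, not Jeremy’s: fixed spend buys fixed traffic, and only the conversion rate moves.

```python
# Hypothetical numbers: fixed spend, fixed traffic, only the conversion rate moves.
spend = 10_000.00            # monthly ad spend, unchanged
visitors = 5_000             # clicks bought with that spend
value_per_conversion = 80.00

for conv_rate in (0.02, 0.04, 0.12):  # bottom quartile -> average -> best in class
    conversions = visitors * conv_rate
    revenue = conversions * value_per_conversion
    roi = (revenue - spend) / spend
    print(f"conv rate {conv_rate:.0%}: {conversions:.0f} conversions, "
          f"revenue ${revenue:,.0f}, ROI {roi:+.0%}")
```

Same spend, 2-6x the conversions, and ROI swings from negative to strongly positive.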
Not all landing pages drive equal amounts of traffic to online tools. Use of build-and-configure tools on car sites is a good indicator of intent to buy. Comparing four automakers, who lost? Toyota had the weakest showing. Why?
Their landing page gives you exactly one thing to do: find a dealer. You can’t build and configure right from their landing page. It’s on a different domain (buyatoyota.com), which confused users. Not a good experience.
Example: Credit Card Applications
Chase and Citi gave calls to action right up front, with “Apply now” links on the landing page. They had 33 percent and 68 percent conversion rates, as opposed to AmEx and Capital One’s 2 percent.
How do you go from best in class to uber best in class? Look at your competitors, take their ideas, incorporate them into your landing pages and test them against your own. See if you can get a lift. Even if you don’t get a lift, you’ll learn what doesn’t work. However, the rules change every 6-12 months so you need to be testing continually.
Gord points out that you also need to focus on how invested the customer is in the goal. The less invested, the more on point you need to be.
Dan Darnell follows and introduces Optimost (part of Interwoven since 2007). Dan points out that you can’t just test your landing pages. You need to test everywhere (landing pages, registration, shopping carts, everything), test everything (headlines, copy, offers, pricing), and test for everyone (new visitors, repeat visitors, weekend traffic, email responders, etc.).
A visitor makes many mini-decisions on a landing page, not just one. It’s not just the form or the image or the header; it’s all of those things, and each tiny decision drives the action. This is where multivariate testing comes in: you can test many different elements all at the same time. Don’t just pick a winner in a meeting. Test the options and decide based on what users respond to.
Example: Qwest
During a 6-week experiment, their KPI was unique click-through rate. The parameters were 4 variable areas, 19 values across areas and 12,986 possible creative permutations. They were looking at the header, the copy, the offer and the image.
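For the curious, that permutation count comes from full-factorial math: multiply the number of variants in each area. Dan didn’t break down how the 19 values split across the four areas, so the per-area counts in this sketch are hypothetical:

```python
from itertools import product
from math import prod

# Hypothetical variant counts per area (the real split of Qwest's 19 values
# across header/copy/offer/image wasn't given in the session).
areas = {
    "header": ["control", "shorter", "question"],
    "copy":   ["control", "trimmed", "links removed"],
    "offer":  ["check availability", "continue"],
    "image":  ["control", "person", "product shot", "red arrow"],
}

print("permutations:", prod(len(v) for v in areas.values()))  # 3*3*2*4 = 72

# Each permutation is one complete creative you could serve:
for combo in list(product(*areas.values()))[:3]:
    print(dict(zip(areas, combo)))
```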
Changing the header copy slightly didn’t make much of a difference, so they could use any of the headers without affecting results.
Reducing the amount of copy had a positive impact, particularly in taking away extra links so that people couldn’t get distracted by other tasks. Less is more.
Changing the offer from “check availability” to “continue” had a negative impact. It wasn’t the right message for the page. Adding a bright red arrow (on an otherwise green and white page) had a positive impact — red means pay attention to me. It helps people focus where you want them to look.
Three changes in images: two had no impact, one had a slight negative impact. Pick your battles. If it’s not that important, don’t worry about it.
In the end, all the changes together increased conversion 28.4 percent.
Example: WebEx
They tested the nav bar. Removing it alone had a slight negative impact. A quote added to the side also had a slight negative impact.
However, combined they had a positive impact. This is why you need to test elements in interaction.
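Here’s a toy 2x2 illustration of that point, with made-up numbers rather than WebEx’s actual data. Each change loses on its own, the combination wins, and a one-change-at-a-time A/B program would never have found the winning cell:

```python
# Hypothetical 2x2 results: (visitors, conversions) per combination.
results = {
    ("nav kept",    "no quote"): (10_000, 300),  # control
    ("nav removed", "no quote"): (10_000, 290),  # worse alone
    ("nav kept",    "quote"):    (10_000, 285),  # worse alone
    ("nav removed", "quote"):    (10_000, 345),  # better together
}

control_rate = 300 / 10_000
for cell, (visits, convs) in results.items():
    rate = convs / visits
    print(f"{cell}: {rate:.2%} ({rate / control_rate - 1:+.1%} vs control)")
```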
Changing the button from “start now” to “continue” was better in this case. You need to align with visitor expectations: if the visitor thinks he should be finished, “continue” won’t work.
Removing privacy language actually reduced response. “Your privacy is assured” can introduce doubt if visitors already know your brand; if they don’t know your brand, the opposite can happen.
Overall the changes led to a 63.65 percent increase in conversions.
Continuously test. Don’t test in isolation. Be adventurous and always look for ways to improve.
Sandra Niehaus wraps it up for us by diving into tracking phone calls.
Example: ifbyphone (a “telephone building block company”)
It was not a pretty picture when they started. [Accompanied by an image of an overweight man in swim trunks. Not, in fact, pretty.]
The problems:
- They had trouble attributing calls to a marketing source, and they had a complex offering and a wide variety of audiences.
- They weren’t happy with the previous company that had managed their PPC campaign.
- The campaign didn’t have specialized landing pages.
They wanted to increase the quantity and quality of:
- Phone calls
- Click to call
- Online conversions (distant third)
They had to get the right audience first. Then they did a landing page redesign using A/B testing to prove that they could in fact make a difference. Then they did multivariate testing based on the winner of the A/B test.
Step one: Audit your PPC — are you attracting the audience you want?
Problem: They were relying too much on broad match instead of the terms they’d chosen. Broad match stretched “call tracking” into “cell phone tracker” and “gps phone tracker”, and “ACD” into “AC/DC”, which was really not the right audience or message. Fixing the PPC campaign reduced spend and increased qualified leads.
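If you want to run that kind of query audit yourself, a minimal sketch looks something like this. The target terms and the report rows are hypothetical stand-ins, not ifbyphone’s actual data:

```python
# Flag search queries that don't contain any of your chosen terms;
# those are negative keyword candidates. All data below is hypothetical.
target_terms = {"call tracking", "acd", "ivr", "click to call"}

search_terms_report = [
    ("call tracking software", 120),
    ("cell phone tracker", 340),     # broad match drift
    ("gps phone tracker", 210),      # broad match drift
    ("hosted acd", 45),
    ("ac/dc tour dates", 95),        # really not the right audience
]

def is_on_target(query: str) -> bool:
    """True if the query contains any of our chosen terms."""
    return any(term in query.lower() for term in target_terms)

negatives = [(q, clicks) for q, clicks in search_terms_report if not is_on_target(q)]
for query, clicks in sorted(negatives, key=lambda x: -x[1]):
    print(f"negative keyword candidate: {query!r} ({clicks} wasted clicks)")
```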
Step two: Landing page testing
Problem: There was way too much happening on the page. Two different phone numbers made it hard to track attribution. There was no clear value proposition. It was a very, very long page with the call to action below the fold.
They redesigned to a simpler design with a clearer value proposition, an obvious call to action and one contact number instead of many. The page was still long but it was more obviously grouped. The call to action was repeated so that you never lost it as you scrolled.
Result: They increased the average call time from 4.75 minutes to 9 minutes, which convinced the board.
Step three: Multivariate testing
Element A: Heading with value proposition
Element B: Call to action with big changes
Element C: Added another value proposition
Overall results were a 3x improvement in call volume and a 2.3x improvement in lead quality with no increase in spend.
The biggest jump was getting the right audience. Second was the A/B test, then the multivariate testing.
Gord jumps in to say you need to figure out what to test and suggests talking to your customers.
Q&A
How did you guys decide what to test?
Jeremy: He says his company is all about looking at your competitors and discovering how to capitalize on what they’re doing. It’s complementary to what Dan and Sandra do. We all have great ideas, and lots of other people have great ideas too, so it’s about capitalizing on them.
Dan: He agrees and says that it’s also about not throwing out ideas during the process, in meetings and such. Look at non-traditional sources. Qwest looked at Amazon for lessons.
Sandra: She suggests reading market research from Enquiro, MarketingSherpa, etc. Learn the elements that have the most impact. Usually it’s the most basic stuff. Understand basic user behavior.
Gord: The bane of doing eyetracking studies is that now he automatically sees heat maps everywhere. Hee.
A/B versus Multivariate: where do they fit in?
Sandra: She likes to start there when the pages are horrible and the client is new or inexperienced with testing. She says she wants to move people up to the next level using a complete redesign.
Dan: Concept testing is good with A/B testing. “Big idea” testing asks: what wild and crazy ideas can we try?
Jeremy: Start simple and get more complicated.
When you’re looking at tools to get into the testing cycle, what do you use?
Sandra: It depends on budget and familiarity with the tools. Sometimes the easiest way is to use paid search ads to test different creatives, which can then ease you into testing pages.
Dan: Sometimes the platform isn’t a good fit. If you’re just getting started, look at Google, look at some of the other folks. There are a number of different solutions at a number of different price points.
Jeremy: There are products and services that allow you to look at that competitive market. His tool is something you use once you’re comfortable with optimization products and you use it in tandem with those.
What kind of budgets do you need for testing?
Jeremy: In terms of competitive analysis, it can range from $10k to $100k for ongoing client engagement. It’s on the higher side for his services.
Dan: He thinks you need to look at this in terms of the value that you’re losing. He can tell you what it costs to work with his company but you need to look at it in relative terms. From a few thousand per month to $30k per month depending on what you want to do. Put it in perspective of the value that you’re losing.
Sandra: Do not look only at the immediate customer loss but also at the lifetime value of the customer. Her background is in landing page design services, mostly using Google’s tools but sometimes others. The budget is usually about $10k. Don’t skimp on the design of the landing page; the skill of the designer has a lot to do with your ultimate success, and a good designer will just naturally produce a better result for you.
Jeremy: Just to add in, these are big impacts to the bottom line. This is very much low hanging fruit.
Do you do segmentation?
Jeremy: Yes, he says he can do behavioral segmentation. That’s actually fairly common. Are you reaching and converting the right people?
Dan: It’s definitely about the best page for that site visitor, not just the best page period. He definitely looks at behavior, demographics, etc., and uses that to figure out who responded to what.
Sandra: It is also a consideration for them. It’s very important to be targeted.
How do you segment during a test? How much traffic do you send to test?
Dan: That depends on how much traffic you have. It needs to be enough.
Jeremy: That’s why his company looks at the competitive research to see what works for someone else without risking his client’s own revenue.
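For the record, Dan’s “enough” has a real number behind it. Here’s a rough sketch using the standard two-proportion sample-size formula; the baseline rate and target lift are my example inputs, not anything from the panel:

```python
from math import ceil, sqrt
from statistics import NormalDist

def visitors_per_variation(p_base, rel_lift, alpha=0.05, power=0.80):
    """Visitors needed per variation to detect a relative conversion
    lift at the given significance level and statistical power."""
    p_test = p_base * (1 + rel_lift)
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p_base + p_test) / 2
    numerator = (z_a * sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * sqrt(p_base * (1 - p_base) + p_test * (1 - p_test))) ** 2
    return ceil(numerator / (p_base - p_test) ** 2)

# e.g. a 3% baseline conversion rate, hoping to detect a 20% relative lift:
print(visitors_per_variation(0.03, 0.20))   # about 14,000 visitors per variation
```

Smaller baseline rates and smaller lifts both push that number up fast, which is exactly why Jeremy leans on competitive research when a client’s own traffic won’t support a test.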
One Reply to “Landing Pages & Multivariate Testing: 2009 Updates”
The coverage over at Outspoken Media may be a bit long for some people, so I’ll give Bruce Clay readers the 30 second summary simply because I like you guys so much.
Ready?
Us: What does SearchWiki do?
Google: I can’t really answer that.
Us: Are you using it for rankings?
Google: I don’t feel comfortable commenting on that.
Us: Where did the idea for SearchWiki come from?
Google: Answering that would only lead to more conspiracy theories. So I won’t.
Us: You know we’re going to lock you in a closet after this session, right?
Google: I don’t really think I can answer that.
[audience pummels speakers]
Good choice on NOT attending that one, Susan.