A lot (okay, a ton) has been written already on landing page design. A great starting point is Chance Barnett’s Landing Pages That Convert. One of the most important elements of a landing page is the unique value proposition (UVP): a headline, image, or tagline that needs to engage the visitor in the first 5 seconds. The “experts” agree that a great UVP can more than compensate for getting everything else wrong on the page. Since CloudFire is primarily distributed through our website, getting the UVP right sounded like the next MVP (maximum learning for minimum effort) to tackle. My strategy was to:
- Start with a simple layout for the landing page
- Vary only the UVP, keeping all other elements the same
- Measure engagement
- Rinse and repeat
Version 1: A Starting Point
While testing my MVP, I started building a basic landing (and pricing) page in parallel that I would show potential customers at the end of the interview. CloudFire’s UVP was predicated on the problem that “Sharing lots of photos and videos is a hassle” so my initial reaction was to come up with a quantifiable statement that could be verified. I started with:
The initial reaction from customers was that they viewed the value proposition of CloudFire not as “performance based” but as “convenience based”. The 5-minute promise was also a little vague.
This version also did not explicitly call out “who” the product was for, which was part subconscious and part by design. We all want to build mainstream products, but trying to act mainstream from the start is a mistake. You can’t afford to be all things to all people.
Version 2: The Emotional Hook
I then turned to Chance Barnett’s rule on leading with “finished story” benefits and tried connecting at a more emotional level:
This version actually got nods of approval from interviewees. It hit the “sharing is a hassle” point and called out “parents” explicitly.
Of course, testing this way was very skewed as I had the benefit of first pitching and demoing to the customer in person. I decided it was time to drive (buy) a little traffic and test. One concern I had was that everyone we interviewed had found their existing photo sharing solution through a friend’s referral and not through Google. I decided to test StumbleUpon, Facebook, and Adwords anyway.
No lean testing approach would be complete without some A/B split testing, so I came up with 3 variants, which I set up using Google Website Optimizer. I expected competition to be fierce in an existing market such as photo sharing, so at this stage I decided to measure engagement simply as the visitor reaching the pricing page. This also sped up testing iterations, which is key to learning. I did track signups but wasn’t ready to optimize the signup process just yet.
Variation 1: Simple Way
Variation 2: Dead-simple
Variation 3: Fastest Way
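Google Website Optimizer handled the variant assignment and tracking for me, but the underlying mechanics are worth seeing. Below is a minimal Python sketch (my own illustration, not CloudFire’s actual code): each visitor is deterministically hashed into one of the three headline variants, and engagement is tallied as whether the visit reached the pricing page.

```python
import hashlib
from collections import Counter

VARIANTS = ["simple-way", "dead-simple", "fastest-way"]

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into one of the variants,
    so a returning visitor always sees the same headline."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

def engagement_by_variant(visits):
    """visits: iterable of (visitor_id, reached_pricing_page) pairs.
    Returns {variant: (conversions, total_visits)}."""
    seen, converted = Counter(), Counter()
    for visitor_id, reached_pricing in visits:
        variant = assign_variant(visitor_id)
        seen[variant] += 1
        if reached_pricing:
            converted[variant] += 1
    return {v: (converted[v], seen[v]) for v in VARIANTS}
```

Hashing (rather than random choice) keeps the experience consistent across page loads without storing any state server-side, which is the same trick most split-testing tools use under the hood.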
Testing Approach 1 — StumbleUpon: A New Alternative
I had read a post on the Lean Startup Circle group about StumbleUpon advertising being a much cheaper alternative ($0.05 per visit) to Adwords and decided to start there. For those who don’t know their model: StumbleUpon suggests pages to users who have expressed interest in a particular topic. Users rate stumbled pages and can recommend them to others. You can select some demographics (sex, age), but the categories are pretty broad. I tested both Male/Female and Female only, ages 25–45, under the Family and Photography categories. I signed up for $5/day, which drove 100 visits per day. While StumbleUpon had no problem driving visits, the bounce rate was 100%.
I liken StumbleUpon to a TV remote: people click through for pages with some entertainment value, and as with TV, engagement for other kinds of pages is pretty low. It didn’t take long to realize it was time to move on to another test channel.
Testing Approach 2 — Facebook: Where Customers Hang Out
Facebook was appealing because a lot of our customers already had their social networks on Facebook and used it quite regularly. With Facebook, however, I had the reverse problem from StumbleUpon: I didn’t get enough visitors per day. The suggested CPCs were high and our CTR really low. I had heard similar reports of low performance from others, and rather than tweaking the ad copy on Facebook, I decided to test against the elephant in the room: Adwords.
Testing Approach 3 — Adwords: The Elephant in the Room
A ton (maybe too much) has also been written about creating and testing PPC campaigns. I used some of these techniques to group and test keywords and ad copy variations. The more general keywords were highly competitive and seemed to get even more expensive from one day to the next. It was good to learn I wasn’t alone. With highly niched keywords (parent targeted), the competition was moderate, but the search volume was too low. It wasn’t even a question of advertising budget: simply not enough people were searching for “sharing baby photos” to drive any meaningful search traffic. Interestingly enough, most of the click-throughs were coming from Google’s content placement channels like YouTube, which were expensive ($2–5) and had low engagement (85% bounce rate).
This validated my earlier finding that SEM might not actually be a viable distribution channel for CloudFire.
Social proof seems to be king when it comes to reaching and connecting with parents. Influence-based channels like blogs, social media, and viral loops are probably the distribution channels that will work here.
The biggest frustration, however, was the lack of learning. There wasn’t enough traffic to make the A/B split tests statistically meaningful. These weren’t cheap click-throughs, and I had no visibility into why people were bouncing. The message just wasn’t getting through, but why…
Testing Approach 4 — UserTesting.com: A Breakthrough
It was time to turn to some usability testing, minus the pitch. I had some experience with face-to-face usability testing from BoxCloud. It can be done fairly cheaply but requires time to find testers, script the test, and then conduct it. Cindy Alvarez has a great presentation on User Testing Tactics. While gearing up for this, I ran across UserTesting.com, which is a way to run usability testing over the web for $29 per 15-minute test. The test session is recorded as a screencast (audio + video). The only thing you really miss out on is body language, but the advantage of not having to conduct the test session myself made it a no-brainer to try.
While $29 per test might sound like a lot, it costs more to run face-to-face usability testing: gift certificates, coffee, your time. More importantly, “yet more experts” have shown that all you need are 5 testers to uncover 85% of the problems. I was spending as much on Adwords in a week, with not enough learning, so it was worth A/B testing both approaches.
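The “5 testers, 85% of problems” figure traces back to Nielsen and Landauer’s model, which assumes each tester independently uncovers any given problem with probability around 31%. A quick sanity check of the arithmetic:

```python
def fraction_of_problems_found(n_testers, p_detect=0.31):
    """Expected share of usability problems uncovered by n testers,
    assuming each tester independently finds any given problem with
    probability p_detect (~0.31 in Nielsen/Landauer's data)."""
    return 1 - (1 - p_detect) ** n_testers

# With 5 testers: 1 - 0.69**5 ≈ 0.84, i.e. roughly 85% of problems
coverage_at_five = fraction_of_problems_found(5)
```

Diminishing returns set in quickly: the fifth tester adds far less than the first, which is why many cheap rounds of 5 tests (iterating the page between rounds) beat one big expensive round.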
With UserTesting.com, you can be highly selective about the target demographic of the tester. No checkboxes, just plain text. Mine read:
***YOU MUST BE A PARENT THAT CURRENTLY SHARES FAMILY PHOTOS AND/OR VIDEO ONLINE TO ACCEPT THIS TASK***
Then you give the tester a script to follow. I asked each tester to view the page for 5 seconds and recount what they remembered. Then I asked them to explain the service, highlight what was different about it, and say whether the difference mattered to them. In other words: can people understand the UVP from the landing page, and is it really a unique value proposition?
After the first 3 tests it was pretty obvious that people did not get the UVP. They viewed CloudFire as yet another photo/video sharing service that promised no hassle and ease of use, which were just empty words to them. CloudFire’s most unique feature was “no uploading required”, but I had purposely kept the headline benefits-based, leaving the product details to the video. Surely people would want to watch the 2-minute tour. Only 1 person clicked on that link, and she only watched it halfway before concluding she had seen enough. Not one person could see how CloudFire was different from their existing service.
Version 3: The Video Alternative
Since people weren’t clicking the video, I decided to supplement the headline with more descriptive text and replace the image area on the right with a slideshow that cycled through highlights of the video at a set interval.
This version started getting better UVP comprehension, but people found the slideshow too busy, so I reverted to the screenshot and came up with some alternate headlines.
Version 4: Alternate Headlines
The big change here was stating the UVP as a unique feature, since the benefit was not resonating. The usability tests had shown that the term “uploading” was very much part of our customers’ everyday vocabulary, so much so that they automatically assumed every photo sharing service required it. I decided to lead by challenging that assumption.
I found the word “instant” didn’t really register with people. I think it’s because “instant” is one of those abused marketing terms that isn’t always taken to mean “immediately” or “in zero seconds”. I remember once seeing a Google ad that read: “Instant Website — Up and running in 30 minutes”.
That said, this version did a lot better than the last at getting the “no uploading required” point across. Some testers saw the difference immediately; others realized it at some point during the test window. A few just didn’t see it no matter how much time they spent on the page.
Version 5: Busy Parents
So I made one other change: I put a “No uploading required” starburst on the image itself, and that seemed to do the trick. Almost every tester now immediately reported “no uploading required” and was curious about what that meant.
Curiosity is the first step towards Engagement
This version is fast-forwarded a few iterations. By then, I had also split-tested a new headline, “Photo and Video Sharing for Busy Parents”, which answered the “what” and “who” of the product and used the words “busy parents” to connect at a more emotional level than just “parents”. It also changed the call-to-action button from “Download Now” to “Try us for Free”. The point was not to hide the fact that CloudFire is downloaded software but to test whether that affected engagement. I’ll go into more on optimizing the signup process later.
The most important takeaway for me was realizing that marketing optimization is not like code optimization. It’s much harder to infer causality from raw data, and there is no substitute for talking to real people. So, as Steve Blank loves to say: “Get out of the building”.
Update: If you liked this content, consider checking out my book, Running Lean, which details step-by-step instructions for building, testing, and measuring landing pages for your MVP.
You can learn more here: Get Running Lean.