From Minimum Viable Product to Landing Pages

A lot (okay, a ton) has already been written on landing page design. One of the most important elements of a landing page is the unique value proposition (UVP): a headline, image, or tagline that needs to engage the visitor in the first 5 seconds. The “experts” agree that a great UVP can more than compensate for getting everything else wrong on the page. Since CloudFire is primarily distributed through our website, getting the UVP right sounded like the next MVP (maximum learning for minimum effort) to tackle. My strategy was to:

  • Start with a simple layout for the landing page
  • Vary the UVP while keeping all other elements the same
  • Measure engagement
  • Rinse and repeat

Version 1: A Starting Point

While testing my MVP, I built a basic landing (and pricing) page in parallel to show potential customers at the end of each interview. CloudFire’s UVP was predicated on the problem that “Sharing lots of photos and videos is a hassle.” Hence, my initial instinct was to develop a quantifiable statement that could be verified. I started with the following:

The initial reaction from customers was that they didn’t view CloudFire’s value proposition as “performance-based” but as “convenience-based.” The 5-minute promise was also a little vague.

This version did not explicitly call out “who” the product was for, which was part subconscious, part by design. We all want to build mainstream products, but trying to act mainstream from the start is a mistake. You can’t afford to be all things to all people.

Version 2: The Emotional Hook

I then turned to Chance Barnett’s rule on leading with “finished story” benefits and tried connecting at a more emotional level:

This version did get nods of approval from interviewees. It hit the “sharing is a hassle” point and called out “parents” explicitly.

Of course, testing this way was quite skewed since I had the benefit of pitching and demoing to each customer in person first. I decided it was time to drive (buy) a little traffic and test. I was concerned that everyone we interviewed had found their existing photo-sharing solution through a friend’s referral rather than through Google, but I decided to test StumbleUpon, Facebook, and Adwords anyway.

No lean testing approach would be complete without some A/B split testing, so I came up with three variants, which I set up using Google Website Optimizer (a sketch of the test mechanics follows the variations below). I expected the competition to be fierce in an existing market such as photo sharing, so at this stage, I decided to measure engagement simply as whether the visitor reached the pricing page. This also sped up testing iterations, which are key to learning. I did track signups but wasn’t ready to optimize the signup process just yet.

Variation 1: Simple Way

Variation 2: Dead-simple

Variation 3: Fastest Way
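For those curious about the mechanics, here is a minimal sketch (in Python) of the kind of split test Google Website Optimizer was running for me: deterministically bucket each visitor into one of the three variants, then count how many reach the pricing page. This is illustrative only; the visitor IDs and counting plumbing are my assumptions, not CloudFire’s actual implementation.

    import hashlib

    # The three headline variants under test (from the variations above).
    VARIANTS = ["Simple Way", "Dead-simple", "Fastest Way"]

    def assign_variant(visitor_id: str) -> str:
        """Deterministically bucket a visitor so they always see the same variant."""
        digest = hashlib.md5(visitor_id.encode()).hexdigest()
        return VARIANTS[int(digest, 16) % len(VARIANTS)]

    def engagement_rate(reached_pricing: int, visitors: int) -> float:
        """Engagement here is simply: did the visitor reach the pricing page?"""
        return reached_pricing / visitors if visitors else 0.0

    print(assign_variant("visitor-42"))   # same visitor always gets the same variant
    print(engagement_rate(12, 300))       # e.g. 12 of 300 visitors -> 0.04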

Testing Approach 1 — StumbleUpon: A New Alternative

I had read a post on the Lean Startup Circle group about StumbleUpon advertising being a much cheaper alternative ($0.05 per visit) to Adwords and decided to start there. For those who don’t know their model, StumbleUpon suggests pages to users who have expressed interest in a particular topic. Users rate stumbled pages and can recommend them to others. You can select some demographics (sex, age), but the categories are pretty broad. I tested both Male/Female and Female-only, ages 25–45, under the Family and Photography categories. I signed up for $5/day, which drove 100 visits per day. While StumbleUpon had no problem driving visits, the bounce rate was 100%.

I liken StumbleUpon to a TV remote: people click through pages looking for entertainment value, and, as with TV, engagement with everything else is pretty low. It didn’t take long to realize it was time to move to another test channel.

Testing Approach 2 — Facebook: Where Customers Hang Out

Facebook was appealing because many of our customers already had their social networks on it and used it regularly. With Facebook, however, I had the reverse problem from StumbleUpon: I couldn’t get enough visitors per day. The suggested CPCs were high, and our CTR was low. I had heard similar low-performance reports from others, so rather than tweaking the ad copy on Facebook, I decided to test against the elephant in the room — Adwords.

Testing Approach 3 — Adwords: The Elephant in the Room

A ton (maybe too much) has also been written about creating and testing PPC campaigns. I used these techniques to group and test keywords and ad copy variations. The more general keywords were highly competitive and seemed to get even more expensive from one day to the next. It was good to learn I wasn’t alone. With highly niched keywords (parent-targeted), the competition was moderate, but the search volume was too low. It wasn’t even a question of advertising budget: not enough people were searching for “sharing baby photos” to drive any meaningful search traffic. Interestingly enough, most of the click-throughs were coming from Google’s content placement channels, like YouTube, which were expensive ($2–5 per click) and drove low engagement (85% bounce rate).

This validated my earlier finding that SEM might not actually be a viable distribution channel for CloudFire.

Social proof seems king when it comes to reaching and connecting with parents. Influence-based channels like blogs, social media, and viral loops are probably the distribution channels that will work here.

The biggest frustration, however, was the lack of learning. There wasn’t enough traffic to make the A/B split tests statistically meaningful. These weren’t cheap click-throughs, and I had no visibility into why people bounced. The message wasn’t getting through, but why…
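To put the traffic problem in numbers, here is a back-of-the-envelope sample-size calculation using the standard two-proportion approximation (the 5% baseline engagement rate and 2-point lift are illustrative assumptions, not CloudFire’s actual numbers):

    import math

    def sample_size_per_variant(p_base: float, lift: float,
                                z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
        """Visitors needed per variant to detect an absolute lift at
        ~95% confidence and 80% power (two-proportion approximation)."""
        p1, p2 = p_base, p_base + lift
        p_bar = (p1 + p2) / 2
        numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                     + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
        return math.ceil(numerator / (p2 - p1) ** 2)

    print(sample_size_per_variant(0.05, 0.02))  # ~2,210 visitors per variant

At roughly 100 paid visits per day split across three variants, that works out to over two months of spend per meaningful result, and still no answer to the “why.”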

Testing Approach 4 — UserTesting.com: A Breakthrough

It was time to turn to some usability testing, minus the pitch. I had some experience with face-to-face usability testing from BoxCloud. Such tests can be run fairly cheaply but require time to find testers, script the test, and then conduct it. Cindy Alvarez has a great presentation on User Testing Tactics. While gearing up for this, I ran across UserTesting.com, a way to run usability tests over the web for $29 per 15-minute test. Each session is recorded as a screencast (audio + video), so the only thing you miss out on is body language. The advantage of not having to conduct the test session myself made it a no-brainer to try.

While $29 per test might sound like a lot, it costs more to run face-to-face usability testing once you factor in gift certificates, coffee, and your time. More importantly, “yet more experts” have shown that you need only five testers to uncover 85% of the problems. I was spending as much per week on Adwords with insufficient learning, so it was worth A/B testing both approaches.
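That five-tester rule of thumb comes from the Nielsen/Landauer problem-discovery model. A quick sketch, assuming the commonly cited figure that each tester independently uncovers about 31% of the usability problems:

    # Share of usability problems found by n testers (Nielsen/Landauer model),
    # assuming each tester uncovers L = 31% of the problems on their own.
    def problems_found(n: int, L: float = 0.31) -> float:
        return 1 - (1 - L) ** n

    for n in (1, 3, 5, 10):
        print(n, f"{problems_found(n):.0%}")  # 5 testers -> ~84%, the "85%" rule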

With UserTesting.com, you can be highly selective about the target demographic of the tester. No checkboxes, just plain text. Mine read:

***YOU MUST BE A PARENT THAT CURRENTLY SHARES FAMILY PHOTOS AND/OR VIDEOS ONLINE TO ACCEPT THIS TASK***

Then you give the tester a script to follow. I asked testers to view the page for 5 seconds and recount what they remembered. I then asked them to explain the service, highlight what was different about it, and say whether that difference mattered to them. In other words: can people understand the UVP from the landing page, and is it a Unique Value Proposition?

After the first three tests, it was obvious that people did not get the UVP. They viewed CloudFire as yet another photo/video sharing service that promised no hassle and ease of use — which were just empty words to them. CloudFire’s most unique feature was “no uploading required,” but I had purposely kept the headline benefit-based, leaving the product details to the video. Surely people would want to watch the 2-minute tour. Only one person clicked on that link, and she only watched it halfway before concluding she had seen enough. Not one person could see how CloudFire was different from their existing service.

Version 3: The Video Alternative

Since people weren’t clicking the video, I decided to supplement the headline with more descriptive text and replace the image area on the right with a slideshow that cycled through highlights of the video at a set interval.

This version started getting better UVP comprehension, but people found the slideshow too busy, so I reverted to the screenshot and came up with alternate headlines.

Version 4: Alternate Headlines

The big change here was stating the UVP as a unique feature, since the benefit alone was not resonating. The usability tests had shown that the term “uploading” was very much part of our customers’ everyday vocabulary, so much so that they automatically assumed every photo-sharing service required it. I decided to lead by challenging that assumption.

I found the word “instant” didn’t register with people. I think it’s because “instant” is one of those abused marketing terms that isn’t always taken to mean “immediately” or “in zero seconds.” I remember seeing a Google ad that read: “Instant Website — Up and running in 30 minutes”.

This version did much better than the last in getting the “no uploading required” point across. Some testers saw the difference immediately; others realized it at some point during the test window. A few didn’t see it no matter how much time they spent on the page.

Version 5: Busy Parents

So I made another change: I put a “No uploading required” starburst on the image itself, which seemed to do the trick. Almost every tester immediately reported “No uploading required” and was curious about what it meant.

Curiosity is the first step toward Engagement

Awesome!

This version fast-forwards through a few iterations. By then, I had also split-tested a new headline, “Photo and Video Sharing for Busy Parents,” which answered the “what” and “who” of the product and used the words “busy parents” to connect at a more emotional level than just “parents.” It also tested a different call-to-action button, changing “Download Now” to “Try us for Free.” The point was not to hide that CloudFire is downloaded software but to test whether that affected engagement. I’ll go into more detail on optimizing the signup process later.

Takeaway

The most important takeaway was realizing that marketing optimization is not like code optimization. It’s much harder to establish causality from raw data, and there is no substitute for talking to real people. So, as Steve Blank loves to say: “Get out of the building.”

Update: If you liked this content, consider checking out my book, Running Lean, which provides step-by-step instructions for building, testing, and measuring landing pages for your MVP.

You can learn more here: Get Running Lean.
