A First Look at Some Metrics Numbers

Last time I shared my conversion dashboard and promised some numbers. I don’t have all the numbers yet, but I have enough to start identifying some actionable next steps.

First, some tools discussion is in order.

I have been using Mixpanel and evaluating KISSmetrics for my metrics data. While I still see many possibilities using Mixpanel for application-level metrics (e.g., what features are getting used), I find KISSmetrics a lot more aligned with my goals for conversion metrics.

Here’s why… First, some common goodness between the two: Both are near-real-time. Both are event-driven. Both can define custom properties on events like plan type, operating system version, browser version, etc.
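For example, recording a sign-up event with a couple of custom properties looks roughly the same in both tools (a sketch; the event and property names here are illustrative, not from my actual setup):

// Mixpanel: track an event with custom properties
mpmetrics.track('Signed Up', {'Plan Type': 'Pro', 'Browser': 'Firefox 3.5'});

// KISSmetrics: record the equivalent event with the same properties
KM.record('Signed Up', {'Plan Type': 'Pro', 'Browser': 'Firefox 3.5'});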

But here’s where KISSmetrics shines:

Ad-hoc funnel reports
While Mixpanel and KISSmetrics use events to construct funnel reports, Mixpanel assumes all funnels are linear and fixed. You must pre-define the exact sequence of steps upfront, then hardcode them into your pages.

For example, to record a “Created Gallery” event during a “Signup Flow” for funnel analysis, you would generate an event like this:

mpmetrics.track_funnel('Signup Flow', 3, 'Created Gallery');

where 3 identifies the third step in your funnel. Pre-defining funnels like this is fragile. If the event occurs out of order, it isn’t counted. And if you need to add another event to uncover more detail, you must touch every event that follows it in the flow.
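To make that fragility concrete: if you later insert a new step in the middle of the funnel, every hardcoded step number after it has to change (a sketch reusing the call above; the new step name is illustrative):

// before: 'Created Gallery' is step 3
mpmetrics.track_funnel('Signup Flow', 3, 'Created Gallery');

// after inserting a new step 3 (say, 'Viewed Tour'), every later step shifts
mpmetrics.track_funnel('Signup Flow', 3, 'Viewed Tour');
mpmetrics.track_funnel('Signup Flow', 4, 'Created Gallery');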

KISSmetrics, on the other hand, collects events that can later be assembled into one or more ad-hoc funnel reports.

You would code the same “Created Gallery” event as:

KM.record("Created Gallery");

You don’t need to identify a step number or funnel for this event. This sort of late composition decouples the raw event data from how it is eventually used in ad-hoc reports, which is incredibly flexible.

What I also found, a little to my surprise, is that CloudFire’s conversion funnel is linear but not fixed. People generally move from the top of the funnel toward the bottom, but they can jump in at multiple points. I’ve had users click directly to the signup page from another site without going through the landing or pricing pages. This wouldn’t get recorded in a Mixpanel-style pre-defined funnel since it wouldn’t start at Step 1.

Ability to identify users
Another challenge when collecting metrics is tracking users consistently across sessions. Most analytics tools use unique cookies, but those break down across browsers or multiple computers. CloudFire, a downloaded p2web app, has the additional requirement of multiple-domain support. KISSmetrics offers a simple (almost too simple) solution for tracking users across any of these scenarios.

Before a user’s identity is known, such as pre-sign-up, KISSmetrics tracks users with a unique cookie (like other analytics tools). But once the identity is known, such as post-login, you can call the

KM.identify("$user_identifier");

method and tell it exactly who the user is, using a persistent identifier such as a username or email address. All pre-login data is then merged with the post-login data into a single record.

This way, events are tied to people and are much more meaningful.
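A minimal sketch of the whole flow (the event names and email value here are hypothetical):

// pre-login: events accrue against an anonymous cookie
KM.record('Visited Pricing Page');

// post-login: bind the anonymous history to a persistent identity
KM.identify('jane@example.com');

// subsequent events are recorded against jane@example.com,
// and the earlier anonymous events merge into the same record
KM.record('Downloaded App');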

1-Page Reports
I like the 1-page report visualization, which you’ll see later. However, I wish they had made their dashboard a little more useful.

Mine currently looks like this:

You might recognize it as my AARRR conversion dashboard. All it’s missing are conversion percentages next to each funnel.

On to the numbers

As I mentioned last time, I am laying the foundation for measuring all the metrics but only focusing on optimizing Activation and Retention initially. As each of the AARRR metrics is a sub-funnel, I broke them out into separate funnel reports rather than build one giant report that would be a nightmare to maintain in the long run.

The ability to create ad-hoc funnel reports discussed earlier allows me to measure everything at the macro level and add more detail to drill into the sub-funnels as needed for optimization.
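Concretely, that means instrumenting a handful of macro events now and assembling the funnel reports later inside KISSmetrics. Here’s a sketch of the events behind the reports in this post (the exact names are my own labels):

// Acquisition
KM.record('Visited Landing Page');
KM.record('Visited Pricing Page');

// Activation
KM.record('Signed Up');
KM.record('Downloaded App');
KM.record('Created First Gallery'); // still measured via a custom DB report for now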

Acquisition Report

I define Acquisition (user engagement) as a visitor who doesn’t abandon and visits the pricing page (usually from the landing page). I am collecting KM events on my landing and pricing pages and have created a funnel report that looks like this:

A 35% conversion (or 65% bounce rate) isn’t particularly great, but it’s good enough to drive meaningful traffic to validate my MVP for now (scaling comes after product/market fit). Apart from my earlier adventures in SEM, I have not spent any more money on paid channels and am investing time building up some viable free channels (SEO, blogs) in parallel.

Activation Report

I define Activation (happy first-user experience) as a sub-funnel made up of the following steps: Sign-up, Download App, and Create First Gallery.

The first two steps occur on the CloudFire website, while the last happens in the downloaded application. I only recently started using KISSmetrics and haven’t finished integrating it with the app, so I’m relying on an earlier custom database report I created to measure “Created First Gallery.”

Here’s what the KISSmetrics Activation funnel looks like without the last step:

You’ll see what I meant by CloudFire’s conversion funnel being linear but not fixed: 11 new people viewed the sign-up form, and 8 new people downloaded the app without starting from the top of the funnel.

Supplementing this, as closely as possible, with my custom report of users who successfully “Created First Gallery” lowers the Activation conversion rate from 11.5% to 5.3%.

This is where I currently am on the numbers. I will finish data collection for Retention, Referral, and Revenue this week, but the Activation numbers already reveal several potential hot spots.

Normalized Conversion Dashboard

I like to visualize my conversion funnel as a percentage of total visitors and have normalized the numbers to reflect that. I’ve also blown up the Activation row to show the full Activation sub-funnel since I’m highly motivated to optimize it. As I don’t have the other numbers yet, I’m not showing them for now.

These numbers immediately indicate that the Activation process is NOT healthy.

While all three steps exhibit leaking buckets, I was particularly concerned with losing more than half the people who downloaded the app but couldn’t successfully finish the first task of creating a gallery. That’s where I decided to start.

It’s not that the others aren’t as important, but they seem more like an optimization problem (pricing, sign-up form, copy, etc.) than a fundamental failing of the MVP (the software) itself. Plus, people who made it to the last step successfully navigated the previous steps, so there is added rationale for starting at the bottom of the funnel.

Diagnosing downloads

Despite there being an easy way to contact us (800 number, email, GetSatisfaction) on every page of the sign-up process, most people didn’t reach out when things went wrong, which placed the burden of figuring out what happened on us.

The first step was being able to identify users who downloaded the app. I used to allow users to download the application from the website and complete the signup process from within the app. The idea was to reduce friction and make the app self-contained so it could be distributed from other websites (like download.com). However, if a user had a problem with the installation, there was no way of knowing. So I reordered the flow so that users create an account on the website first, then download the app. That way, we have their email address and can contact them if needed.

The second step was identifying as quickly as possible when users ran into an issue. It was easy to construct a report that finds users who signed up but didn’t finish creating their first gallery within a reasonable timeframe. I sent them all personalized emails (this is now automated), and happily, many replied. Some downloaded the installer but didn’t know to run it (how do I fix that?). Others had issues with the installation itself, which they shared. No one, so far, has had issues creating a gallery once the app was installed and launched.
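A minimal sketch of that report’s logic, assuming sign-up and gallery-creation timestamps can be pulled from the database (the 48-hour window, field names, and function name are my assumptions):

// users: [{email, signedUpAt: Date}], galleries: [{email, createdAt: Date}]
function findStalledUsers(users, galleries, windowHours) {
  var created = {};
  galleries.forEach(function (g) { created[g.email] = true; });
  var cutoff = Date.now() - windowHours * 60 * 60 * 1000;
  // signed up long enough ago, but never created a first gallery
  return users.filter(function (u) {
    return !created[u.email] && u.signedUpAt.getTime() < cutoff;
  });
}

// e.g., candidates for a personalized follow-up email
var stalled = findStalledUsers(users, galleries, 48);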

Downloaded apps, in general, are a challenge. The desktop, with multiple operating system versions, java versions, anti-virus programs, NATs, and firewalls, is a pretty hostile environment for a new networked application.

One of the issues I uncovered stemmed from a nasty shortcut Apple took when force-migrating everyone from 32-bit Java 5 to 64-bit Java 6 in Snow Leopard: a symbolic link pointing Java 1.5 -> Java 1.6 (WTF!). This broke CloudFire. Fixing it required upgrading a 3rd-party component (Eclipse), which in turn required rewriting the software update process (now using P2/OSGi) and my continuous deployment process (future post). Other issues included 64-bit versus 32-bit Java on Windows and bad pre-existing Java installations.

I’ve found that, in the end, users WILL encounter unanticipated problems because you can only test so many desktop/browser configurations (until you can afford to run all of them). The key is to identify users who run into problems as quickly as possible, then engage them directly with an offer of help, gift certificates, extended trials… whatever it takes to get them to talk to you, as they hold the answers (actually, they hold the problems; it’s up to you to uncover the answers).

I’ve also started running more usability tests on the download process. These haven’t revealed anything as significant as the issues already uncovered, so maybe I’ll start seeing some improvement in those numbers soon.

What’s Next

Completing the rest of the conversion dashboard, prioritizing other areas in Activation/Retention that need addressing, A/B testing, usability testing, and customer follow-up interviews.
