Conversions - An overview of your results

This is Part 3/5 of a series explaining “Conversions,” a feature available to users who have purchased the Enterprise Edition.

 

You’ve set up your Conversion Analysis, chosen the relevant signals you want to consider, and you’re seeing results in the Opportunities table, which may look something like the picture below. What does it all mean?

image16_copy.png

When in doubt, we recommend hovering your mouse over numbers and headers to activate the explanations built into the table. We’ll show some examples of the tooltips throughout this section while we focus on exploring a real issue with our checkout funnel.

 

Article_-_14.png

An overview of your Conversions results

What do my results mean?

Conversions compares two groups of users for each user experience signal: those who did have the experience (“affected”) and those who did not (“unaffected”). In the following sections, let’s focus on one specific signal, a dead click on an element that has the text “Your F.S. Order.”

When the affected group is sufficiently different from the unaffected group, we highlight the signal in the Opportunities table. We share more details about how we define this later on.

Hovering over the numbers in the table will give you more information about the two groups. In this case, there are 2,057 user sessions that experienced the dead click, which is about 47% of the total cohort of 4,347. That means there are 2,290 unaffected user sessions. You can also see more about these numbers by clicking into the Opportunity Analysis view for this result.

Screen_Shot_2019-08-19_at_2.44.11_PM.png

By default, we prioritize the list based on confidence level. You can also choose to order by number affected or conversion impact. The conversion impact is calculated as the relative change between the unaffected and affected groups’ conversion rates (see Part 5 for more details about this). The unaffected group converted at 47.16%, and the affected group who experienced the dead click only converted at 9.72%. That’s a 79.38% drop in conversion rate!

Screen_Shot_2019-08-19_at_2.45.18_PM.png

So, if users in those 2,057 affected sessions had actually converted at the unaffected rate of 47.16% instead of 9.72%, we calculate that 770 more transactions might have been completed.

Screen_Shot_2019-08-19_at_2.46.03_PM.png
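If you want to sanity-check these figures, the sketch below reproduces the arithmetic using the example numbers from this section. It is purely illustrative; FullStory calculates all of these values for you in the Opportunities table, and small rounding differences (e.g. 79.39% vs. the 79.38% shown above) come from the table using unrounded conversion rates.

```typescript
// Illustrative arithmetic only -- FullStory computes these values for you.
// The numbers below are the example figures quoted in this section.

const totalSessions = 4347;    // total cohort for the funnel
const affectedSessions = 2057; // sessions that experienced the dead click
const unaffectedSessions = totalSessions - affectedSessions;
console.log(`Unaffected sessions: ${unaffectedSessions}`); // 2290

const unaffectedRate = 0.4716; // 47.16% conversion rate without the signal
const affectedRate = 0.0972;   // 9.72% conversion rate with the signal

// Conversion impact: relative change between the two conversion rates.
const conversionImpact = (unaffectedRate - affectedRate) / unaffectedRate;
console.log(`Conversion impact: ${(conversionImpact * 100).toFixed(2)}%`); // ~79.39%

// Estimated lost conversions: extra conversions expected if the affected
// group had converted at the unaffected rate.
const lostConversions = Math.round(affectedSessions * (unaffectedRate - affectedRate));
console.log(`Estimated lost conversions: ${lostConversions}`); // ~770
```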

What should I do if I have zero or very few results?

“I don’t think that my website is perfect, since my users aren’t converting at 100%...but why didn’t FullStory surface any results for my funnel?”

Screen_Shot_2019-08-19_at_2.46.50_PM.png

There are a few different reasons your Conversion Funnel may not be returning results, so let’s go through some ideas for troubleshooting.

Review your Conversion Funnel

Make sure that you’re looking at a funnel where at least some users are making it through to the final step. As mentioned in the second article, funnels with zero successful conversions will not return any results.

Screen_Shot_2019-08-19_at_2.47.33_PM.png

Narrow or widen the date range

As you build the Conversion Funnel, we show you a preview of the funnel behavior over the past 30 days. If your preview looked good, but running your Conversions analysis over a different date range failed to show results, you may want to adjust the date range.

If you’re looking at “Past Month” and not seeing good results, consider instead looking at “Past Week”. This could help highlight issues that only started very recently (e.g. when a new release went out) that might otherwise be obscured by three weeks of normal behavior.

On the other hand, if you’re not seeing results for a short date range, there may not be enough user sessions to analyze, so try widening your date range. Keep in mind that if you select a custom date range, it must be 30 days or less.

Adjust your Conversion Funnel definition

If the number of users at your very first step in the funnel is much larger than all the others, this could be due to “tire kickers.” For instance, setting your first funnel step to be your website homepage can be tempting, but many people are likely to land and browse without intending to complete your funnel. Their high bounce rate on that first page can dilute more actionable signals about friction coming from further downstream in the customer experience.

Screen_Shot_2019-08-21_at_2.49.48_PM.png

To avoid tire kickers weakening your analysis, you can either define a shorter funnel or add dependent criteria to the first step, such as only including users who stayed on the first page for at least 2 seconds.

Adjust the population you’re considering

If you’ve narrowed in on a small group like mobile users and see no results, try switching to a larger audience such as “Everyone.”

You can also think about a specific group likely to have problems: maybe a group coming in from a specific marketing campaign, or a particular device. Narrowing in on another population might help to surface some new results.

To create a new cohort to analyze, you’ll need to create and save a new Segment. Navigate to the main FullStory Segments view, search for the cohort of users you want, and save that segment. You’ll then be able to select that segment as an option under “Performed By” in the Conversions Composer.

Article_-_21.png

Consider more signals

Remember that, out of the box, FullStory is analyzing four potential signals for your Conversion Funnel: performance issues, Rage Clicks, Dead Clicks, and Error Clicks. Perhaps for your funnel those signals aren’t yielding useful results. If so, you can try adding more signals like Custom Events or Watched Elements to get more insights!

  • Custom Events instrumented for error/friction moments are particularly useful. If you fire a Custom Event every time a user encounters a JavaScript error and include the message as an event property, you’ll be able to analyze which JavaScript errors are preventing users from completing the funnel (see the sketch after this list). Conversions supports string and boolean property types.
  • Perhaps you watch a session where a user encounters a pop-up error dialog. Put a “Watched Element” on the CSS selector for the error dialog, and in the future you’ll be able to see how that dialog affects conversion rates.
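As an example of the first bullet above, here is a minimal sketch of sending a FullStory Custom Event whenever an uncaught JavaScript error occurs in the browser. It assumes the FullStory snippet is installed and exposes the FS.event API with suffix-based property names (e.g. message_str); the event name, property names, and the “/checkout” path check are hypothetical, so adapt them to your site and verify against the current FullStory documentation.

```typescript
// Sketch: report uncaught JavaScript errors to FullStory as Custom Events.
// Assumes the FullStory snippet is loaded and exposes FS.event; all names
// below (event name, property names, "/checkout" path) are illustrative.

window.addEventListener('error', (e: ErrorEvent) => {
  const FS = (window as any).FS; // FullStory's browser API object, if present

  if (FS && typeof FS.event === 'function') {
    FS.event('JavaScript Error', {
      // String property: the error message, analyzable in Conversions.
      message_str: e.message,
      // Boolean property: hypothetical flag for errors on the checkout page.
      on_checkout_page_bool: window.location.pathname.startsWith('/checkout'),
    });
  }
});
```

Once events like this are flowing, you can add the Custom Event as a signal in the Conversions Composer and compare conversion rates for sessions with and without a given error message.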

Confidence Level

Nobody knows your product and your business better than you. This is why FullStory surfaces all potential opportunities to you, regardless of their level of confidence. This allows you the flexibility to pinpoint problematic issues before they have a chance to grow and scale across a larger set of your end-users.

What is significant?

There are a couple of ways to think about which results returned in the Opportunities table may be significant:

  1. Opportunities are initially returned sorted by Confidence Level, which is determined by our statistical method. It is good practice to treat confidence above 90% or even 95% as statistically significant (for intuition, see the sketch after this list). Opportunities with lower confidence are more likely to be false positives due to chance, and our estimates of lost conversions for them will be less reliable.
  2. Looking at opportunities sorted by Affected User count can also be a good way to determine significance. If the affected user count makes up a large fraction of the users traveling through a given funnel (think 90% or more), that is a good indication that the issue is widespread and should be addressed.
  3. The third, and probably best, indication of significance would be an opportunity affecting a high percentage of the users that also has a high confidence level.
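For intuition about what a high Confidence Level implies, the sketch below runs a standard two-proportion z-test on the example numbers from earlier in this article. This is purely illustrative and is not necessarily the statistical method FullStory uses.

```typescript
// Illustrative only: a textbook two-proportion z-test comparing the
// affected and unaffected groups' conversion rates. FullStory's actual
// statistical method may differ.

function twoProportionZ(
  convertedA: number, totalA: number, // affected group
  convertedB: number, totalB: number, // unaffected group
): number {
  const pA = convertedA / totalA;
  const pB = convertedB / totalB;
  const pooled = (convertedA + convertedB) / (totalA + totalB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  return (pA - pB) / se; // |z| above ~1.96 corresponds to roughly 95% confidence
}

// Example numbers from this article: ~9.72% of 2,057 affected sessions and
// ~47.16% of 2,290 unaffected sessions converted.
const z = twoProportionZ(
  Math.round(2057 * 0.0972), 2057,
  Math.round(2290 * 0.4716), 2290,
);
console.log(`z = ${z.toFixed(1)}`); // a very large |z|, i.e. very high confidence
```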

It can also be good practice to keep an eye on the opportunities that are not yet significant, especially those with low affected user counts, to see whether they grow in significance over time.

 

image16_copy.png

Next, we will explore how you can dig deeper into your results and watch the associated sessions.
