Conversions - Digging into your Results Part II

Available for the following Plan types:

FullStory Enterprise

FullStory Advanced

FullStory Business

FullStory for Mobile Apps

FullStory Free

Learn all about conversions in our interactive learning module! Check it out here.


This is Part 5/5 of a series explaining Conversions.


In the previous article, we explained how to dig deeper into the details of an opportunity. In this article, we provide a technical addendum offering full transparency into the calculations behind the Conversions magic. It’s especially relevant to data scientists, analysts, and the technically inclined.

Three Ways We Find Conversion Impacts for You

To maximize insights with Conversions, it’s important to understand the special cases that arise when we automatically compare the performance of an “affected” and an “unaffected” group through a funnel for a given Signal. Behind the scenes, we compute several versions of the impact formula and show you only the most relevant one that applies.

Note: When interpreting your funnel conversion rates between affected and unaffected groups, be aware that the conversion impact of an experience may not occur at exactly the same step of the funnel as the experience itself. There may be delayed effects, especially if experiences such as JavaScript errors accumulate or cause downstream knock-on effects for the user.

Here are the three approaches we take to detect a significant conversion impact for a given funnel. Conversions first attempts the “end to end” approach and presents that impact if it finds one; otherwise it tries the second approach, and so on, showing the result from the first approach that succeeds.
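The cascading order can be sketched as follows. This is a purely illustrative reconstruction (the function and approach names are hypothetical, and only a toy end-to-end check is shown); FullStory’s actual statistical tests are internal.

```python
# Hypothetical sketch of the cascading analysis order. Each approach returns
# an impact value, or None when nothing statistically significant is found.

def find_conversion_impact(affected, unaffected, approaches):
    """Try each (name, approach) pair in order; return the first impact found.

    `affected` and `unaffected` are per-step user counts through the funnel,
    e.g. [1000, 600, 120] for a 3-step funnel.
    """
    for name, approach in approaches:
        impact = approach(affected, unaffected)
        if impact is not None:  # None means "no significant impact here"
            return name, impact
    return None  # no approach surfaced anything for this Signal

def end_to_end(a, u):
    # Toy check: "succeeds" only when the affected group's overall conversion
    # rate is lower than the unaffected group's (no significance test shown).
    a_rate, u_rate = a[-1] / a[0], u[-1] / u[0]
    return (u_rate - a_rate) if a_rate < u_rate else None

approaches = [("end-to-end", end_to_end)]  # mid-to-end, step-to-step would follow
result = find_conversion_impact([1000, 600, 120], [1000, 650, 260], approaches)
```

In this toy run, the affected group converts 12% end to end versus 26% for the unaffected group, so the first approach succeeds and the later ones are never tried.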

1. End to End of the Funnel

In the simplest case, your affected and unaffected groups perform like two textbook funnels: every step shows some dropoff, and the end-to-end conversion rate of the affected group is low enough compared to the unaffected group to be considered statistically significant. For this, the horizontal histogram in the analysis summary area (see the image below) will indicate numbers for the first and last steps of the funnel, and there will be a pink dotted box around the whole funnel in the bigger chart below.

[Screenshot: analysis summary with a pink dotted box around the entire funnel]
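One standard way to decide whether an end-to-end conversion gap like this is statistically significant is a one-sided two-proportion z-test. The sketch below is an illustrative stand-in, not FullStory’s actual test, which may differ.

```python
# One-sided two-proportion z-test: is the affected group's end-to-end
# conversion rate significantly LOWER than the unaffected group's?
from statistics import NormalDist

def end_to_end_significant(conv_a, n_a, conv_u, n_u, alpha=0.05):
    """conv_* = converted users; n_* = users who entered the funnel."""
    p_a, p_u = conv_a / n_a, conv_u / n_u
    p_pool = (conv_a + conv_u) / (n_a + n_u)              # pooled proportion
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_u)) ** 0.5
    z = (p_a - p_u) / se                                  # negative if affected converts less
    p_value = NormalDist().cdf(z)                         # one-sided p-value
    return p_value < alpha, p_value

# 8% vs 12% end-to-end conversion, 1000 users entering each group's funnel:
significant, p = end_to_end_significant(conv_a=80, n_a=1000, conv_u=120, n_u=1000)
```

With these toy numbers the gap is significant well below the usual 0.05 threshold.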


2. Mid to End of the Funnel

This “mid to end” case was actually highlighted in our FruitShoppe store examples from the earlier articles of this Conversions series.

[Screenshot: Analysis View with a pink dotted box around the last two funnel steps]

Notice that the pink dotted box in the Analysis View is only around the last two steps, while the first two are grayed out. That happens when we detect that a given Signal (in this example, an error click on a specific button) doesn't even occur until mid-way through the funnel.

Here, we've calculated the conversion impact from the first step highlighted by the pink box to the end. We do this to ensure that impacts from that Signal can be brought to your attention. If we only calculated the impact end-to-end (from Step 1 to 4 in this example), we might miss significant differences in conversion rate that only apply after the event has happened. 
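In terms of the per-step counts, the “mid to end” comparison amounts to measuring conversion from the Signal’s first step through the last step in each group. The sketch below illustrates that arithmetic with made-up counts (the significance test is elided, and the function name is hypothetical).

```python
# "Mid to end": compare conversion from the step where the Signal first
# occurs through the end of the funnel, for each group.

def mid_to_end_rates(affected, unaffected, signal_step):
    """Per-step user counts; `signal_step` is the 0-based index of the
    first step on which the Signal can occur."""
    a_rate = affected[-1] / affected[signal_step]
    u_rate = unaffected[-1] / unaffected[signal_step]
    return a_rate, u_rate

# Toy 4-step funnel where the Signal occurs on step 3 (index 2): the
# affected group converts 20% from there to the end, vs ~43% unaffected.
a_rate, u_rate = mid_to_end_rates([900, 700, 400, 80], [5000, 3900, 2100, 900], 2)
```

Measuring from `signal_step` rather than step 1 is what keeps a late-funnel Signal’s impact from being diluted by the (unaffected) earlier steps.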

Note: Normally, the first highlighted step is the step on which the Signal occurs. Whether you can be sure of this for your funnel depends on the type of Signal you are analyzing and how much you know about its origin.

For instance, it’s clear that our dead click on a specific page must have occurred precisely at step 4 of the FruitShoppe funnel given the element involved. We wouldn’t expect to see attrition from the affected group at steps prior to visiting that page. However, you’ll notice that the affected group does not maintain 100% of its size on earlier steps! Many digital experiences involve users looping back in an app’s workflow, like returning to product pages after reviewing cart contents. Our attempts to visualize user flows linearly through a funnel don’t fully correspond to the complex reality. So, it’s reasonable to expect that some FruitShoppe users would have experienced the dead click and returned to earlier steps in the funnel, where they may have then dropped out. Conversions uses a heuristic to detect this and will ignore small early dropoffs of this kind.
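One way such a heuristic could work is a simple tolerance check: treat attrition in the affected group before the Signal’s step as loop-back noise as long as it stays small relative to the group’s starting size. This is purely illustrative; FullStory’s actual rule is not documented here, and the function name and threshold are assumptions.

```python
# Illustrative heuristic: ignore pre-Signal attrition in the affected group
# when it is within a small tolerance of the group's starting size.

def small_early_dropoff(affected, signal_step, tolerance=0.05):
    """affected = per-step user counts; signal_step = 0-based index of the
    step where the Signal occurs. Returns True if earlier attrition is
    small enough to ignore."""
    lost_early = affected[0] - affected[signal_step]
    return lost_early / affected[0] <= tolerance

# 3% of the affected group drops out before the Signal's step (index 2):
ok = small_early_dropoff([1000, 990, 970, 400], signal_step=2)
```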

In contrast, you might be analyzing a generic JavaScript error that could occur anywhere along the funnel. This error might occur on one or many steps and may not even be caused by a user’s action. In this case, you might see a strong dropoff between two steps as well as smaller dropoffs elsewhere in the funnel. The pattern might look the same as in the dead click case above; the only difference is the context with which you interpret it.

3. Step to Step within the Funnel

There are times when a more granular approach is needed to uncover useful insights. Even if you’ve looked at step-to-step transition rates before (sometimes known as micro-conversions), comparing these transitions between affected and unaffected groups can be unfamiliar and harder to interpret.

Let’s say your desktop users typically convert more often than your mobile users, but an impactful bug on the desktop version of your website affects everyone in that cohort and reduces their chance of converting. Suppose the bug doesn’t reduce desktop conversion enough to push it below that of the mobile group; otherwise, Conversions could surface the issue using one of the previous approaches.

From an end-to-end viewpoint over all your users, the affected group (in this case, Desktop users) is still converting better overall than the unaffected group (Mobile users), so there would be nothing to report using the previous two approaches. But if there’s a significant loss of customers between two steps for Desktop users compared to the loss on the same steps for Mobile users, then Conversions might tell you that you could be converting even more desktop users if you addressed this bug. 
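The scenario above can be made concrete with toy numbers: a Desktop cohort that beats Mobile end to end, yet loses far more users at one specific transition. This is an illustrative sketch of the comparison, not FullStory’s implementation.

```python
# Desktop (affected) converts better end-to-end than Mobile (unaffected),
# but a bug hurts the step-2 -> step-3 transition for Desktop users.
desktop = [1000, 800, 400, 300]
mobile  = [1000, 700, 600, 210]

def step_rates(counts):
    """Micro-conversion rate for each step-to-step transition."""
    return [counts[i + 1] / counts[i] for i in range(len(counts) - 1)]

overall_desktop = desktop[-1] / desktop[0]   # 30% end to end
overall_mobile  = mobile[-1] / mobile[0]     # 21% end to end

# Most negative (affected - unaffected) gap across all transitions:
worst_gap = min(a - u for a, u in zip(step_rates(desktop), step_rates(mobile)))
```

Here Desktop still wins end to end (30% vs. 21%), so the first two approaches find nothing, yet Desktop’s step 2 → 3 rate (50%) badly trails Mobile’s (~86%). That negative gap is exactly the kind of stepwise impact this third approach can surface.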

Note: We can’t guarantee that Conversions will catch every possible impact with the stepwise approach, because it depends on the exact pattern of micro-conversion rates in the unaffected group’s funnel.


The stepwise type of opportunity is not easy to detect with tools that only analyze one cohort’s funnel activity at a time. Conversions always analyzes two related cohorts at the same time, which makes this comparison possible. We have seen our customers obtain valuable insights from stepwise impacts more frequently than we had initially expected.

[Screenshot: Analysis View highlighting a stepwise conversion impact between two funnel steps]

In the unlikely situation that we find multiple stepwise impacts for one Signal in the same funnel analysis, we only surface the one with the highest lost conversions.
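One simple way to rank competing stepwise impacts is a counterfactual estimate of lost conversions: apply the unaffected group’s transition rate to the affected group’s users at each step and see where the shortfall is largest. This is an assumption-laden sketch; FullStory’s actual lost-conversions calculation may differ.

```python
# Counterfactual estimate of conversions lost at each step transition:
# "how many more affected users would have advanced if this transition
# matched the unaffected group's rate?"

def lost_conversions_per_step(affected, unaffected):
    losses = []
    for i in range(len(affected) - 1):
        u_rate = unaffected[i + 1] / unaffected[i]
        expected = affected[i] * u_rate          # if affected matched unaffected
        losses.append(max(0.0, expected - affected[i + 1]))
    return losses

losses = lost_conversions_per_step([1000, 800, 400, 300], [1000, 700, 600, 210])
biggest = losses.index(max(losses))  # the one transition that would be surfaced
```

With these toy counts, only the second transition shows a shortfall, so it is the one stepwise impact that would be presented.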


Need to get in touch with us?

The FullStory Team awaits your every question.

Ask the Community

Technical Support