This is Part 3/5 of a series explaining “Conversions,” a feature available to users who have purchased the Enterprise Edition.
You’ve set up your Conversion Analysis, chosen the relevant signals you want to consider, and you’re seeing results in the Opportunities table which may look something like the picture below. What does it all mean?
Note: When in doubt, we recommend hovering your mouse over numbers and headers to activate the explanations built into the table. We’ll show some examples of these tooltips throughout this section while we focus on exploring a real issue with our checkout funnel.
Conversions compares two groups of users for each user experience signal: those who did have the experience (“affected”) and those who did not (“unaffected”). In the following sections, let’s focus on one specific signal, a dead click on an element that has the text “Your F.S. Order.”
When the affected group is sufficiently different from the unaffected group, we highlight the signal in the Opportunities table. We share more details later on about how we define “sufficiently different.”
Hovering over the numbers in the table will give you more information about the two groups. In this case, there are 2,057 user sessions that experienced the dead click, which is about 47% of the total cohort of 4,347. That means there are 2,290 unaffected user sessions. You can also see more about these numbers by clicking into the Opportunity Analysis view for this result.
By default, we prioritize the list by confidence level. You can also choose to order by number affected or by conversion impact. The conversion impact is calculated as the relative change between the unaffected and affected groups’ conversion rates (see Part 5 for more details). The unaffected group converted at 47.16%, while the affected group who experienced the dead click converted at only 9.72%. That’s a 79.38% drop in conversion rate!
So, if users in those 2,057 affected sessions had actually converted at the unaffected rate of 47.16% instead of 9.72%, we calculate that 770 more transactions might have been completed.
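The arithmetic above can be reproduced in a few lines. This is just a sketch using the example figures from this article; the exact rounding FullStory applies internally may differ slightly:

```python
# Example figures from the article above.
affected_sessions = 2057
unaffected_rate = 0.4716   # unaffected group's conversion rate
affected_rate = 0.0972     # affected group's conversion rate

# Conversion impact: relative change between the two groups' rates.
impact = (unaffected_rate - affected_rate) / unaffected_rate
print(f"Conversion impact: {impact:.2%}")   # ~79.39% drop

# Estimated lost conversions if the affected sessions had instead
# converted at the unaffected rate.
lost = affected_sessions * (unaffected_rate - affected_rate)
print(f"Estimated lost conversions: {lost:.0f}")   # ~770
```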
“I don’t think that my website is perfect, since my users aren’t converting at 100%...but why didn’t FullStory surface any results for my funnel?”
There are a few different reasons your Conversion Funnel may not be returning results, so let’s go through some ideas for troubleshooting.
Make sure that you’re looking at a funnel where a non-zero number of users are making it through to the final step. As mentioned in the second article, funnels where there are zero successful conversions will not lead to any results being shown.
As you build the Conversion Funnel, we show you a preview of the funnel behavior over the past 30 days. If your preview looked good, but running your Conversions analysis over a different date range failed to show results, you may want to adjust the date range.
If you’re looking at “Past Month” and not seeing good results, consider instead looking at “Past Week”. This could help highlight issues that only started very recently (e.g. when a new release went out) that might otherwise be obscured by three weeks of normal behavior.
On the other hand, if you’re not seeing results for a short date range, there may not be enough user sessions to analyze, so try widening your date range. Keep in mind that if you select a custom date range, it must be 30 days or less.
If the number of users at your very first step in the funnel is much larger than all the others, this could be due to “tire kickers.” For instance, setting your first funnel step to be your website homepage can be tempting, but many people are likely to land and browse without intending to complete your funnel. Their high bounce rate on that first page can dilute more actionable signals about friction coming from further downstream in the customer experience.
To keep tire kickers from weakening your analysis, you can either define a shorter funnel or add dependent criteria to the first step, such as only considering users who spent at least 2 seconds on the first page.
If you’ve narrowed in on a small group like mobile users and see no results, try switching to a larger audience such as “Everyone.”
You can also think about a specific group likely to have problems: maybe a group coming in from a specific marketing campaign, or a particular device. Narrowing in on another population might help to surface some new results.
To create a new cohort to analyze, you’ll need to create and save a new Segment. Navigate to the main FullStory Segments view, search for the cohort of users you want, and save that segment. You’ll then be able to select it as an option under “Performed By” in the Conversions Composer.
Remember that, out of the box, FullStory is analyzing four potential signals for your Conversion Funnel: performance issues, Rage Clicks, Dead Clicks, and Error Clicks. Perhaps for your funnel those signals aren’t yielding useful results. If so, you can try adding more signals like Custom Events or Watched Elements to get more insights!
- For example, watch a session where a user encounters a pop-up error dialog. Put a “Watched Element” on the CSS selector for that dialog, and in the future you’ll be able to see how the error dialog affects conversion rates.
Nobody knows your product and your business better than you. This is why FullStory surfaces all potential opportunities to you, regardless of their level of confidence. This allows you the flexibility to pinpoint problematic issues before they have a chance to grow and scale across a larger set of your end-users.
There are a couple of ways to think about which results returned in the Opportunities table may be significant:
- Opportunities are initially returned sorted by Confidence Level, which is determined by our statistical method. It is good practice to treat confidence above 90% or even 95% as statistically significant. Opportunities with lower confidence are more likely to be false positives due to chance, and our estimates of lost conversions for them will be less reliable.
- Looking at opportunities sorted by Affected User count can also be a good way to determine significance. If an affected user count makes up a large fraction of the users traveling through a given funnel (think 90% or more) then that is a good indication that an issue is widespread and should be addressed.
- The third, and probably best, indication of significance would be an opportunity affecting a high percentage of the users that also has a high confidence level.
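FullStory doesn’t document the exact statistical method behind its Confidence Level, but a standard way to compute confidence for a difference between two conversion rates is a two-proportion z-test. The sketch below illustrates that general approach with only the Python standard library; it is an assumption for illustration, not FullStory’s actual implementation, and the counts are the example figures from earlier in this article:

```python
import math

def confidence_level(conversions_a, n_a, conversions_b, n_b):
    """Two-sided two-proportion z-test: returns the confidence
    (1 - p-value) that the two groups' conversion rates differ."""
    p_a, p_b = conversions_a / n_a, conversions_b / n_b
    pooled = (conversions_a + conversions_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = abs(p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))
    return 1 - p_value

# Example figures from earlier in the article:
# unaffected: 2,290 sessions at ~47.16%; affected: 2,057 at ~9.72%.
conf = confidence_level(1080, 2290, 200, 2057)
print(f"Confidence: {conf:.1%}")   # essentially 100% for a gap this large
```

With a smaller gap or fewer sessions, the confidence drops quickly, which is why short date ranges or narrow segments can fail to surface significant results.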
It can be a good practice to pay attention to the opportunities that are currently not significant, especially the ones with low affected user counts, to see if these opportunities grow in significance over time.
Next, we will explore how you can dig in to analyze your results further and watch sessions.