Discrepancies arise from differences between how two systems measure and from mistakes made in implementation. Identifying the cause of a discrepancy is therefore worthwhile.
The biggest cause of discrepancies between two analytics systems is users who leave before the page has finished loading, at the point where one analytics system has loaded but the other has not. Discrepancies of a few percent are reasonable. A large discrepancy (10% or more) needs to be investigated.
If you are looking for information on a particular type of discrepancy, these posts may be helpful:
- What causes discrepancies between two analytics systems?
- Discrepancy between clicks and page lands
- Link clicks vs landing page views big discrepancy
- Discrepancy between landing page sessions and email campaign button clicks
Platform-specific discrepancies
- Why is there a discrepancy in Facebook landing page views and Google Analytics report
- What could cause a discrepancy between Google Analytics and Adwords for the same website
- Why is there a WordPress discrepancy between Jetpack stats and Google Analytics
- Why is there a huge discrepancy between Google and Twitter Analytics?
- Why is there so much discrepancy between Google Analytics reports and DFP reports
- Why is there a huge discrepancy between Adwords and Google Analytics reports?
- Why is there discrepancy between Facebook Ads and Google Analytics?
- How do I explain the data discrepancy between Google Analytics and Adobe
- What is a normal discrepancy between Linkedin Ads and Google Analytics
- Why is there a discrepancy between Floodlight reports and attribution data on Google Analytics?
What is a normal discrepancy?
When analyzing discrepancies, scale is important. With low traffic, a handful of people leaving early represents a large share of the sample and distorts the data. So collect the largest sample you can: ideally several hundred visits, and at least 100.
Discrepancies of 2-3% over large samples are fairly normal. Discrepancies of over 10% are concerning and warrant further investigation.
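As a rough sanity check before digging in, you can compute the percentage gap between the two systems' counts. A minimal sketch in Python, with the 3% and 10% bands simply mirroring the guideline above:

```python
def discrepancy_pct(system_a: float, system_b: float) -> float:
    """Percentage difference between two counts, relative to the larger one."""
    baseline = max(system_a, system_b)
    if baseline == 0:
        return 0.0
    return abs(system_a - system_b) / baseline * 100


# Example: 10,000 sessions in one system vs 9,700 in the other -> 3.0%
gap = discrepancy_pct(10_000, 9_700)
if gap <= 3:
    print(f"{gap:.1f}% - within the normal range")
elif gap < 10:
    print(f"{gap:.1f}% - worth keeping an eye on")
else:
    print(f"{gap:.1f}% - investigate")
```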
What causes discrepancies?
The most common causes of discrepancies include:
- Accidental clicks
- People hit stop before the page has loaded, or close the tab
- People load the page but the analytics didn't load before they left
- Too many scripts loaded by the destination page
A click discrepancy of up to 15% is reasonable. If it is a lot higher, you may want to adjust your campaign: the settings, the creative or the destination page.
High discrepancies can be caused by:
- Slow page speed; make sure your page loads quickly.
- Mobile clicks; mobile users tend to cancel more often, experience slower networks, and leave before the page loads.
How to solve a discrepancy
To solve a discrepancy, you must identify its root cause. Work through these steps to uncover the exact cause of the discrepancy you are facing.
1. Create a spreadsheet
Create a spreadsheet and load the data from both systems into it.
Break down the data into:
- Multiple 24-hour ranges
- A week range
- A breakout by device
This should help you see where the discrepancies are occurring.
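A minimal sketch of that breakdown, assuming two hypothetical CSV exports with "date", "device", "clicks" and "sessions" columns; adjust the file and column names to whatever your platforms actually export:

```python
import pandas as pd

# Hypothetical exports: system_a.csv (e.g. ad platform clicks) and
# system_b.csv (e.g. site analytics sessions). Column names are assumptions.
a = pd.read_csv("system_a.csv", parse_dates=["date"])
b = pd.read_csv("system_b.csv", parse_dates=["date"])

# Add a week column so we can compare a weekly range as well as daily ones.
for df in (a, b):
    df["week"] = df["date"].dt.to_period("W").astype(str)

def breakdown(keys):
    merged = (
        a.groupby(keys)["clicks"].sum().to_frame()
        .join(b.groupby(keys)["sessions"].sum(), how="outer")
        .fillna(0)
    )
    merged["discrepancy_%"] = (
        (merged["clicks"] - merged["sessions"]).abs()
        / merged[["clicks", "sessions"]].max(axis=1).replace(0, 1) * 100
    )
    return merged.sort_values("discrepancy_%", ascending=False)

# Daily, weekly and per-device views of where the gap is largest.
for keys in (["date"], ["week"], ["device"]):
    print(breakdown(keys).head(), "\n")
```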
2. Try a controlled experiment
If you can, click through yourself and observe the result. This can help uncover issues: broken links, slow-loading pages, or experiences that cause a bounce.
Another option is to buy some extra traffic and see how it is recorded in each system.
3. Check timezones
Check that the timezones of the reports you are comparing are the same.
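Timezone offsets alone can shift events into different calendar days. A small Python example with placeholder timezones (an ad platform reporting in US Pacific time versus an analytics property set to UTC):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A click at 11:15 pm Pacific time falls on the *next* day in UTC, so daily
# totals disagree even though both systems recorded the same event.
click_time = datetime(2024, 3, 1, 23, 15, tzinfo=ZoneInfo("America/Los_Angeles"))
as_utc = click_time.astimezone(ZoneInfo("UTC"))

print(click_time.date())  # 2024-03-01 in the ad platform's report
print(as_utc.date())      # 2024-03-02 in the analytics report
```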
4. Check metric definitions on each platform
Each platform may measure things differently, which can create meaningful differences. It may be that you need to compare different metrics than you originally thought.
For example, Nudge measures People, which is similar to Unique Visitors but more diligent in de-duplicating, so the two numbers will differ. In that case, comparing impressions against page views gives a better basis for comparison.
5. Check Google Pagespeed
Slow page loads, on mobile or in general, can greatly increase discrepancies at the margin.
The analytics may load fine for you. But across a large population of visitors, some will leave before the analytics load and some will use incompatible browsers.
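One way to check is Google's PageSpeed Insights API. A rough sketch of querying the v5 endpoint, assuming a simplified view of the response; for regular use you would add an API key:

```python
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

# Mobile is usually where the problem shows up first.
resp = requests.get(PSI, params={"url": "https://example.com/landing", "strategy": "mobile"})
resp.raise_for_status()
data = resp.json()

# Overall Lighthouse performance score (0-1) and one load metric; field names
# reflect the API's current response shape and may change.
perf = data["lighthouseResult"]["categories"]["performance"]["score"]
lcp = data["lighthouseResult"]["audits"]["largest-contentful-paint"]["displayValue"]
print(f"Performance score: {perf}, largest contentful paint: {lcp}")
```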
6. Check where scripts are loading
If you are comparing two scripts on the same page, make sure they are loading close to one another in the code. If they are loading far apart, this itself will create discrepancies.
Also check that each script loads correctly, and only once. Problems often occur when a script is inadvertently loaded twice: once on the page and again through a tag manager.
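A quick way to spot duplicated or widely separated tags is to fetch the page and look at where they appear in the HTML. A sketch using example patterns for gtag.js and Google Tag Manager; substitute the tags you are actually comparing:

```python
import re
import requests

html = requests.get("https://example.com/landing").text

# Example patterns only - swap in the analytics snippets you actually use.
patterns = {
    "gtag.js": r"googletagmanager\.com/gtag/js",
    "GTM container": r"googletagmanager\.com/gtm\.js|GTM-[A-Z0-9]+",
}

for name, pattern in patterns.items():
    positions = [m.start() for m in re.finditer(pattern, html)]
    # More than one occurrence can mean the tag is loaded both on-page and via
    # a tag manager; offsets far apart mean the scripts fire at very different
    # points in the page load.
    print(f"{name}: {len(positions)} occurrence(s) at byte offsets {positions}")
```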
7. Check adblocking
For certain audiences, ad blocking may be more prevalent, and your tag manager or analytics provider may not load for those visitors.
8. Check traffic source parameters
It may be that some traffic isn't being attributed correctly because of missing or malformed traffic source parameters. Double-check these.
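A minimal sketch of checking that landing URLs carry the tracking parameters your attribution relies on; the required UTM parameters listed here are an assumption, so match them to your own setup:

```python
from urllib.parse import urlparse, parse_qs

# Parameters assumed to be required for correct source attribution.
REQUIRED = {"utm_source", "utm_medium", "utm_campaign"}

landing_urls = [
    "https://example.com/landing?utm_source=facebook&utm_medium=cpc&utm_campaign=spring",
    "https://example.com/landing?utm_source=facebook",  # partially tagged
    "https://example.com/landing",                      # untagged - lands as direct/referral
]

for url in landing_urls:
    params = set(parse_qs(urlparse(url).query))
    missing = REQUIRED - params
    print(url, "->", "ok" if not missing else f"missing {sorted(missing)}")
```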
Why do people worry about discrepancies?
Discrepancies cause concern because they can point to an underlying problem that needs to be addressed. For example, a high click drop-off means the brand has to spend more money to get the same number of customers to its store.
Solving the issue then helps the brand get more customers overall. That is the real worry behind a discrepancy: is there an issue here that is costing us money?
High click discrepancies can be caused by fraud or low quality traffic
The other reason brands are concerned about discrepancies is that they could indicate the traffic is low quality or even fraudulent.
This is where other metrics such as Attention, Bounce & Scroll can help uncover whether the traffic quality is low. These post-click metrics quickly show whether users intended to click and whether they intended to engage further.
Low-quality traffic can make it seem like there is a discrepancy when, in reality, the traffic never loads the destination page or hits the back button before arriving. That's what makes discrepancies worth investigating.
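If you have a session-level export, a rough sketch of flagging suspicious sources from post-click behaviour might look like the following; the column names and thresholds are illustrative stand-ins, not Nudge's actual metrics:

```python
import pandas as pd

# Hypothetical export with one row per session: "source", "bounced" (0/1)
# and "seconds_on_page".
sessions = pd.read_csv("sessions.csv")

quality = sessions.groupby("source").agg(
    sessions=("bounced", "size"),
    bounce_rate=("bounced", "mean"),
    median_seconds=("seconds_on_page", "median"),
)

# Illustrative thresholds: near-total bounce and almost no time on page.
suspicious = quality[(quality["bounce_rate"] > 0.9) & (quality["median_seconds"] < 2)]
print(suspicious)
```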
Explore more on how to identify fraud in your traffic.