Study: How In-Platform Reporting Compares to Real Results

Attribution is vital to digital marketing for so many reasons: It proves success, helps pinpoint problem areas, and informs future spending decisions. Without a verified source of truth, you might be flying blind with your marketing efforts.

Another reason to seek out good attribution solutions is that they’ll often give you data that’s more granular and accurate than what the marketing platforms themselves report. Because services like Meta Ads and Google Ads want to encourage you to keep investing ad spend with them, they may use generous counting windows that allow them to claim more credit for clicks and conversions. On the other hand, because these platforms don’t have access to the full breadth of your marketing program or your website activity, they may in some cases undercount the influence of other factors.

Recently, ADM worked with leading attribution platform Measured to look into this issue on behalf of one of our clients, johnnie-O. We had noted that Meta Ads’ in-platform metrics, which rely on a 7-day click/1-day view attribution window, appeared to overstate the channel’s impact. Conversely, Google Analytics’ last-click metrics failed to account for the additional influence of exposure and sentiment driven by Meta Ads, undervaluing the channel’s effectiveness.
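
To make the difference concrete, here is a minimal, simplified sketch of how a 7-day click/1-day view window and a last-click model can credit the very same purchase to different channels. This is illustrative only: the touchpoint data is made up, and the rules are intentionally simplified rather than a reproduction of Meta’s or Google’s actual attribution logic.

```python
# Illustrative sketch (made-up data, simplified rules) of how one purchase can be
# credited differently under a 7-day click / 1-day view window vs. last-click.

from datetime import datetime, timedelta

# Hypothetical touchpoints leading up to a single purchase.
touchpoints = [
    {"channel": "Meta Ads",       "type": "view",  "time": datetime(2023, 5, 1, 9)},
    {"channel": "Meta Ads",       "type": "click", "time": datetime(2023, 5, 3, 18)},
    {"channel": "Organic Search", "type": "click", "time": datetime(2023, 5, 6, 20)},
]
conversion_time = datetime(2023, 5, 6, 21)

def windowed_credit(touchpoints, conversion_time,
                    click_window=timedelta(days=7),
                    view_window=timedelta(days=1)):
    """Credit Meta Ads if any Meta click falls inside the click window,
    or any Meta view falls inside the view window, before the conversion."""
    for tp in touchpoints:
        if tp["channel"] != "Meta Ads":
            continue
        age = conversion_time - tp["time"]
        if tp["type"] == "click" and age <= click_window:
            return "Meta Ads"
        if tp["type"] == "view" and age <= view_window:
            return "Meta Ads"
    return None

def last_click_credit(touchpoints, conversion_time):
    """Credit whichever channel delivered the final click before the conversion."""
    clicks = [tp for tp in touchpoints
              if tp["type"] == "click" and tp["time"] <= conversion_time]
    return max(clicks, key=lambda tp: tp["time"])["channel"] if clicks else None

print(windowed_credit(touchpoints, conversion_time))    # -> Meta Ads
print(last_click_credit(touchpoints, conversion_time))  # -> Organic Search
```

In this toy example, the windowed model claims the order for Meta because a Meta click happened within seven days of purchase, while last-click hands the same order to organic search. Multiply that across thousands of orders and the two reports diverge dramatically.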

In a new joint case study between ADM, Measured, and johnnie-O, we were able to quantify this phenomenon and determine the degree to which Meta was overvaluing, and Google Analytics was undervaluing, the return on ad spend (ROAS) that johnnie-O’s Meta campaigns were delivering. In the case study below, we explain the methodology used to reach these conclusions and just how far off both reporting sources were, and the findings may surprise you.
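
For reference, ROAS is simply attributed revenue divided by ad spend, which means the same campaigns can produce very different ROAS figures depending on whose attributed revenue goes in the numerator. The sketch below uses purely hypothetical numbers, not johnnie-O’s actual results, to show how that plays out.

```python
# Hypothetical illustration: identical ad spend, three different ROAS figures,
# depending on which source's attributed revenue is used. Numbers are invented
# for demonstration and do not reflect the case study's findings.

def roas(attributed_revenue, ad_spend):
    """ROAS = attributed revenue / ad spend."""
    return attributed_revenue / ad_spend

ad_spend = 10_000  # hypothetical monthly Meta spend

# Hypothetical revenue credited to the same campaigns by each source.
revenue_by_source = {
    "Meta in-platform (7-day click / 1-day view)": 45_000,
    "Google Analytics last-click": 12_000,
    "Independently verified": 28_000,
}

for source, revenue in revenue_by_source.items():
    print(f"{source}: ROAS = {roas(revenue, ad_spend):.1f}x")
```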