Now That We’ve Improved Media Transparency, Let’s Do the Same With Data

Media transparency has been top of mind for marketers over the last few years. As a result, a thriving ecosystem of verification initiatives, vendors, and solutions emerged to measure issues such as fraud and viewability.

The attention given to these issues has worked, too. According to Integral Ad Science's Media Quality Reports, direct publisher display improved significantly across these metrics between 3Q'15 and 2H'17 (the widest apples-to-apples gap between comparable periods of the year in their reports). Viewability increased from 46.3% to 62.2%; put another way, the non-viewable share fell from 53.7% to 37.8%, a roughly 30% reduction in waste. Fraud decreased from 3.2% to 2.1%, a reduction of about 34%. Clearly, this improvement has been great for marketers and, in turn, for the industry.
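For readers who want to check those waste figures, here is a minimal sketch of the arithmetic (the percentages come from the IAS reports cited above; the helper function is our own illustration):

```python
def relative_decrease(before: float, after: float) -> float:
    """Relative reduction in a waste rate, expressed as a percentage."""
    return (before - after) / before * 100

# The non-viewable share is the complement of the viewability rate.
nonviewable_2015 = 100 - 46.3   # 53.7% of impressions wasted
nonviewable_2017 = 100 - 62.2   # 37.8% of impressions wasted

print(f"Non-viewable waste cut: {relative_decrease(nonviewable_2015, nonviewable_2017):.0f}%")  # ~30%
print(f"Fraud cut: {relative_decrease(3.2, 2.1):.0f}%")                                         # ~34%
```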

In October 2017, when asked why viewability had increased so substantially, Integral Ad Science answered, "The first major step forward was for the industry to identify there was a viewability challenge."

Today, there is a different challenge that requires the same level of industry attention: data quality. Data has become an enormous piece of the digital advertising puzzle. Self-service programmatic marketers are now accustomed to spending as much on data as on media, each averaging roughly $0.50 to $2 CPM. A recent IAB study found that US marketers spent more than $20 billion on third-party data alone in 2017, and that figure doesn't account for the infrastructure supporting first-party data. Almost 90% of marketers expect to increase or maintain their third-party data spend over the next two years. The proliferation of data stores, such as those available from LiveRamp and Oracle, together with simple UIs that make thousands of segments available at the push of a button, makes it easier than ever for marketers to tap into the data marketplace.
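To put those CPMs in context, here is a back-of-the-envelope sketch (our own arithmetic; the campaign size is hypothetical, and the CPM values are drawn from the $0.50 to $2 range above):

```python
def cpm_cost(impressions: int, cpm: float) -> float:
    """Total cost of buying `impressions` at a given CPM (cost per 1,000 impressions)."""
    return impressions / 1000 * cpm

impressions = 50_000_000            # hypothetical campaign size
media_cpm, data_cpm = 2.00, 2.00    # data CPM often rivals media CPM

media_cost = cpm_cost(impressions, media_cpm)
data_cost = cpm_cost(impressions, data_cpm)
print(f"Media: ${media_cost:,.0f} | Data: ${data_cost:,.0f} | "
      f"Data share of spend: {data_cost / (media_cost + data_cost):.0%}")
# Media: $100,000 | Data: $100,000 | Data share of spend: 50%
```

At the top of the range, third-party data can effectively double the cost of every impression, which is why its accuracy matters as much as the media it rides on.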

Verification benchmarks show that data quality is as threatening an issue as non-viewable impressions, and a much larger problem than fraud. According to Nielsen's most recent DAR benchmarks report, through 1Q'18, digital ads that are age- and gender-targeted to an audience whose age range spans 16 to 30 years (Nielsen's "Medium" targeting tier, between "Broad" and "Narrow") reach the intended audience only 42% of the time on average across all digital. To be clear: that's 58% of ads being served to the wrong age/gender audience.

The Emodo Institute has analyzed more than 100 dataset studies and found that, on average, the lat/long location data in bid requests is incorrect about 57% of the time, and that 45% of location-targeted audience impressions are inaccurately targeted.

The chart below shows the relative scale of media-related quality issues vs. data-related quality issues, pulled from readily available data. Clearly, data quality is a major issue. It causes significantly more waste than fraud, and it can negatively impact traditional brand metrics like affinity and preference when the wrong messages are delivered to the wrong people.
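The original chart isn't reproduced here, but the comparison can be rebuilt from the figures cited in this article; a minimal sketch (the labels and groupings are ours):

```python
# Waste rates cited in this article, as a percent of impressions wasted.
waste_rates = {
    "Fraud (IAS, 2H'17)": 2.1,
    "Non-viewable (IAS, 2H'17)": 100 - 62.2,
    "Off-target age/gender (Nielsen DAR, 1Q'18)": 100 - 42,
    "Inaccurate bid-request lat/long (Emodo)": 57,
    "Inaccurate location-audience targeting (Emodo)": 45,
}

# Simple text bar chart: one '#' per two percentage points, largest first.
for label, rate in sorted(waste_rates.items(), key=lambda kv: -kv[1]):
    print(f"{label:<50} {rate:5.1f}% {'#' * round(rate / 2)}")
```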

Sources: IAS Media Quality Reports, 3Q'15 and 2H'17 (direct publisher display); Nielsen DAR Benchmarks Report, 1Q'18 (M/F audiences spanning 16-30 years); Ericsson Emodo Verification Benchmarks, 2017-18.

So why does the data industry continue to thrive as a black box, with virtually no transparency or accountability? One reason could be that marketers have reached their limit of spending on incremental third-party digital tools, sometimes referred to as the "ad tech tax." For these marketers, there's good news: over the last few years, separate budget line items such as verification have increasingly become embedded in working media, for instance in the form of "pre-bid algorithms" and viewability segments.

Another reason may be that marketers don't yet know, or acknowledge, the magnitude of the data quality problem. In their day-to-day work, many make decisions and execute campaigns using tools that make little to no attempt to convey data accuracy rates or compare quality scores. In the interest of intuitive self-service and managed service, many DSPs have made data selection so quick and easy that a user can choose from dozens of very similar segments without being exposed to any information that ranks or differentiates them beyond segment description, size, and price. Data providers only add to the void: in their marketing pitches, very few disclose verifiable information about the accuracy of their data.

To buy effectively, marketers need to know how the various segment options compare on accuracy. Just as with viewability, the first step toward fixing data quality is recognizing and acknowledging the problem.

The black box may still be opaque, but the statistics are clear. Marketers’ attention on media-related waste has driven drastic improvements over the last few years. The issue of data quality now demands that same level of focus and accountability. This time around, however, marketers don’t need to add to their tech stack or incur a new tech tax. They simply need to acknowledge the critical nature of the problem and require data vendors to prove the accuracy of the data they’re selling. Once this happens, we’ll see the same dramatic improvements in data quality that we’ve seen in media quality metrics.

Jake Moskowitz is Head of the Emodo Institute, a dedicated organization within Ericsson Emodo wholly focused on the research, education, and resolution of data concerns that mobile advertisers face.
