Low-Cost Data No Longer Means Cheap

"Data-Driven Thinking" is written by members of the media community and contains fresh ideas on the digital revolution in media.

Today’s column is written by Kim Brown, founder and CEO at Centrally Human.

We all know data is an integral ingredient in today’s digital marketing and performance-based world.

Yet with so many data options and a real need for quality, how do marketers differentiate and choose? Oftentimes, marketers employ the same strategy used for buying wine in a restaurant: Look for something familiar that’s moderately priced and assume it is good quality by avoiding the least expensive options.

It’s the digital marketer’s version of the Goldilocks principle: not too expensive, not too cheap, good results (it’s worked before). And it is understandable why many marketers make this decision. The pressure for performance-driven digital marketing is incredible, and data availability has exploded. For example, I’ve heard there are approximately 70,000 audience segments from about 200 partners in one provider's data store.

There is a problem, though, with defaulting to a go-to data provider or ignoring the lowest-priced options: marketers can now get some incredible data from new providers at the lowest of prices.

That’s because the sourcing and accuracy of data have improved dramatically over the past few years, driven by increased consumer and business use of digital (more users = more data) and by better technology for sourcing and organizing data into usable segments. This mass of data availability and technology has created a wave of data disrupters, such as Affinity Answers and 180byTwo, which are delivering high-quality data at competitive prices to win market share.

Some companies offer high-performing data priced 15% to 55% below traditional incumbents' rates. For example, I’ve seen pre-intent data available for $0.60 CPM compared to $1 CPM, while mobile data may be purchased for $1.25 CPM versus $2.25.
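
The savings implied by those quoted CPMs are easy to verify. Here is a small sketch; the CPM figures are the ones cited above, and the helper function itself is purely illustrative:

```python
# Illustrative check of the CPM savings cited in the column.
# The dollar figures come from the text; the function is a hypothetical helper.

def pct_savings(incumbent_cpm: float, disrupter_cpm: float) -> float:
    """Percentage saved by choosing the lower-priced data segment."""
    return round((incumbent_cpm - disrupter_cpm) / incumbent_cpm * 100, 1)

print(pct_savings(1.00, 0.60))   # pre-intent data: prints 40.0
print(pct_savings(2.25, 1.25))   # mobile data: prints 44.4
```

Both examples land in the 15%-to-55% range the column describes.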

With data so pervasive and easy to mine, segment and activate, the pricing structure has to change to accommodate the changing market. No longer does low-cost mean cheap.

We live in a new world of data economics: availability of high-quality data, easy-to-use technology to harness it, established distribution channels, high demand for return on digital ad spend (RODAS) and new competitive disrupters entering the space offering lower-priced comparable data offerings.

With all these market forces in play, I’d expect to see more marketers using startup data providers and demanding lower data prices across the board. But we’re not seeing it happen, and the delay is to the detriment of advertisers’ bottom lines. The sooner marketers start testing and using low-cost data, the better the RODAS for brands and the more budget freed up for high-quality creative and media placements.

Knowing there are incredible resource constraints along with performance pressures, there are three things marketers can start doing today. First, it is critical to start a new conversation within their teams: low-cost data doesn’t automatically mean cheap or low-quality data.

Second, they should institute a discovery process for new data providers, which could involve a monthly or bimonthly review of their collection methodology and a microtest of their data. Many of the big partners and data management platforms offer some of this exploratory work as a service to minimize marketers’ efforts.
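
What might such a microtest look like in practice? A minimal sketch, assuming a matched-budget head-to-head between an incumbent segment and a challenger segment, compared on effective data cost per conversion. All segment names and numbers here are hypothetical:

```python
# A minimal sketch of a data-provider microtest: run matched impression
# volumes against an incumbent segment and a challenger segment, then
# compare effective data cost per conversion. All values are hypothetical.

from dataclasses import dataclass

@dataclass
class SegmentTest:
    name: str
    data_cpm: float      # data cost per thousand impressions, in dollars
    impressions: int
    conversions: int

    def cost_per_conversion(self) -> float:
        data_cost = self.data_cpm * self.impressions / 1000
        return round(data_cost / self.conversions, 2)

incumbent = SegmentTest("incumbent_auto_intender", 1.00, 500_000, 250)
challenger = SegmentTest("challenger_auto_intender", 0.60, 500_000, 240)

for seg in (incumbent, challenger):
    print(seg.name, seg.cost_per_conversion())
```

In this made-up run the challenger converts slightly less often but still wins on cost per conversion ($1.25 versus $2.00), which is exactly the kind of finding a monthly or bimonthly review would surface.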

Finally, agency and in-house marketers should create a standard process for sharing their data-testing findings and experiences in privacy-compliant ways. Sharing what works across teams reduces redundant tests and expedites improved results, so formalizing that information exchange should be a priority for teams that want to maximize RODAS.

Creating a structured process and testing six to 10 new data providers a year seems like a lot, but the return will be worth it: market pressure on the data incumbents to keep their prices competitive and the high probability they’ll discover an amazing new data provider that performs incredibly well for a fraction of the previous data cost.

Follow Centrally Human (@CentrallyHuman) and AdExchanger (@adexchanger) on Twitter.

2 Comments

  1. The fundamental problem with data in adtech is that there is no scalable way to evaluate data signals. This increases risk for data buyers and is slowing down the growth of the entire data industry. The root causes are (a) the archaic pricing models and (b) the extreme lack of transparency & consistency in segment definition. Those of us on the IAB Data Quality Working Group are working on (b). Alas, by itself, this will not solve the problem of optimal segment selection when DMPs offer tens to hundreds of thousands of segments and where the same logical signal, e.g., SUV intender, may be offered by dozens of separate data vendors. The good news is that, responding to pressure from buyers, some market makers (DMPs) are starting to think about strategies to improve market efficiency.
  2. ^^ Yes. It's amazing to see marketers screaming from their rooftops about the need for transparency yet allowing data companies to sell completely blind segments, providing buyers no knowledge of the criteria and sources that form a data signal. This could and should be the next crusade in ad tech.
