Cost vs. Value: Third-Party Targeting Data in the Demand-Side Platform and Exchange Landscape

Aaron Reinitz of VivaKi

"Ad Agents" is a column written by the agency side of the digital media community - and those servicing it.

Aaron Reinitz is Supervisor, Brand Relations, VivaKi Nerve Center - part of Publicis Groupe.

In display media, direct-response tactics are gaining momentum around the concept of audience buying. This approach relies on cookie-based targeting rather than using content as a proxy for audience, and it directs attention squarely toward third-party data aggregators, the core providers of this service. Most press and commentary in the data marketplace has focused on online privacy and self-regulation, but another area worth examining is the difference between the cost and the value of targeting data as a product, and how a disparity between the two may affect pricing in the future.

To think clearly about the concept of data cost, it helps to recognize that in the DSP landscape, ad impressions and targeting cookies are two separate entities, frequently provided by two wholly separate companies. A key benefit of DSPs is combining both pieces in a bidded environment in the interest of cost efficiency and granular testing. Typically, while the impression comes from an ad exchange like Right Media or Google AdEx, the targeting cookie comes from a provider like BlueKai, eXelate, or TARGUSInfo.

Adding data to target a specific type of individual within the exchanges incurs an incremental fee on top of the inventory cost; the data provider is compensated separately from the inventory provider. This is a positive, because it allows very granular testing of specific data segments relative to the inventory alone. In the past, when data and inventory were sold as a coupled product, it was difficult to tell whether the targeting or the media was having the greater impact on performance.
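
To make the unbundled cost model concrete, here is a minimal Python sketch; the providers are unnamed and the CPM figures are hypothetical, invented purely for illustration.

```python
# A minimal sketch of the unbundled cost model described above.
# All CPM figures here are hypothetical.

def effective_cpm(inventory_cpm: float, data_cpm: float = 0.0) -> float:
    """Total cost per thousand impressions: the inventory fee plus the
    incremental data fee, each billed to a separate provider."""
    return inventory_cpm + data_cpm

# Run-of-exchange inventory alone, vs. the same inventory with a
# third-party segment layered on top:
base = effective_cpm(inventory_cpm=1.50)                      # 1.50
targeted = effective_cpm(inventory_cpm=1.50, data_cpm=0.75)   # 2.25

# Because the two fees are separable, the incremental cost of the
# data itself is simply the difference:
data_cost = targeted - base                                   # 0.75
```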

That said, within a test there needs to be an incremental lift in performance sufficient to justify that cost; in other words, the data must deliver enough value. The disparity between cost and value in third-party data is an ongoing debate in the landscape, as the two are not always equal. Further, many within the industry are pushing to reconcile the two in the interest of greater overall performance for advertisers.
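
One simple way to structure such a test is to compare cost per action with and without the segment, and check whether the lift covers the incremental data fee. A hypothetical sketch in Python; all rates and prices below are illustrative, not drawn from any real campaign:

```python
# Hypothetical break-even test: does the lift from a data segment
# justify its incremental fee? All numbers are illustrative.

def cost_per_action(cpm: float, actions_per_thousand: float) -> float:
    """Effective cost of one conversion at a given CPM and action rate."""
    return cpm / actions_per_thousand

# Control cell: run-of-exchange inventory with no third-party data.
control_cpa = cost_per_action(cpm=1.50, actions_per_thousand=0.5)   # $3.00

# Test cell: the same inventory plus a $0.75 CPM data segment that,
# in this toy example, doubles the action rate.
test_cpa = cost_per_action(cpm=2.25, actions_per_thousand=1.0)      # $2.25

# The data earns its fee only if it lowers the effective CPA.
data_justified = test_cpa < control_cpa   # True: the lift exceeds the cost
```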

This raises the question: how is cost being determined? “Pricing continues to be a moving target,” says David Helmreich, VP of Business Development at TARGUSInfo, a provider of demographic, psychographic, and behavioral targeting technology in the space. “Part of the pricing challenge is that there has not been a lot of transparency into performance, and there has not been a good way for people to determine quality. I would love a world where quality is an input into the pricing equation,” he continues. Ultimately, the performance data lives within the walls of the advertiser and agency, and developing a process to communicate what’s working and what’s not back to the data provider is a very real challenge, especially in the absence of a standardized process or ratings system accepted by the industry.

Solving this problem is something technology providers in the space are increasingly making a priority. Zach Weinberg, President and COO of the Google-owned Invite Media, a DSP in the space, asserts that “Data providers and DSPs need to get a little closer. What Invite/Google has been focusing on is making the purchasing process and the evaluation of data segments in our interface easy, so users of the product can freely experiment and test.”

That said, data companies appear willing to entertain the notion of marrying measures of effectiveness with the cost of their product. Mark Zagorski, CEO of eXelate, has asserted that, “If we really want to put our money where our mouth is, performance models that take into account quality aren’t out of the question. If we’re generating lift, it opens the door for us to get paid more, and will help the business grow. That way, everything moves toward a market-driven and transparent price point, where it’s no longer a process of a salesperson trying to get the most they can for a product.”

This momentum is also being driven by the agencies and advertisers themselves. Technology startups emerging around deeply granular measurement and reporting allow agencies to ask questions they couldn’t before. For example, just how precise are targeting segments? Can we be sure that a buy for ‘males 18-24’ is reaching only those young men? “We have found that there are some new companies such as Aggregate Knowledge that have promise in this space. We've also found the data companies themselves realize their models are evolving and are opening up to these discussions,” says Brett Mowry, Vice President/Group Director of Strategy and Analytics at Digitas. With regard to impact on cost, Mowry asserts that, “We believe the data industry will move towards a more transparent model that accounts for bleed in the data buys. Ultimately, though, quality will likely be defined by performance and data will be priced relatively.” The ‘bleed’ he refers to describes instances where the data is imprecise: impressions bought against a segment that reach people outside it.
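
Given an independent verification source for who actually saw the ads, bleed can be quantified as the share of a segment's impressions that fall outside the intended audience. A toy Python sketch; the panel data below is invented for illustration:

```python
# Hypothetical bleed measurement: what fraction of impressions bought
# against a "males 18-24" segment actually reached that audience?
# The verification panel below is illustrative, not real data.

impressions = [
    {"cookie": "a", "verified_demo": "male_18_24"},
    {"cookie": "b", "verified_demo": "male_18_24"},
    {"cookie": "c", "verified_demo": "female_25_34"},  # bleed
    {"cookie": "d", "verified_demo": "male_35_44"},    # bleed
]

on_target = sum(1 for i in impressions if i["verified_demo"] == "male_18_24")
bleed_rate = 1 - on_target / len(impressions)   # 0.5 in this toy sample

# A transparent pricing model could then discount the data fee in
# proportion to measured bleed, as Mowry anticipates above.
```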

Zagorski asserts that the data companies are already addressing these issues. “The way to combat bleed is to look at frequency. We’re capturing data constantly, and create frequency and recency against each data point.” In other words, the more often and consistently the data is refreshed, the more likely it is to be accurate. Regular auditing is important as well. “We can determine what anomalies may exist, and can determine relevancy. We typically show less than 5% of anomalies,” he continues.
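
As a rough illustration of the frequency-and-recency idea, here is a toy scoring function; the half-life, saturation point, and weighting are invented for the example and are not eXelate's actual model:

```python
# Toy recency/frequency score for a targeting data point, in the
# spirit of Zagorski's description. The half-life and weighting
# below are invented for illustration.
import math

def segment_confidence(observations_30d: int, days_since_last: float,
                       half_life_days: float = 14.0) -> float:
    """Confidence in a cookie's segment membership: observations seen
    in the last 30 days, discounted by how stale the latest one is."""
    recency = math.exp(-math.log(2) * days_since_last / half_life_days)
    frequency = min(observations_30d / 5.0, 1.0)   # saturates at 5 sightings
    return recency * frequency

fresh = segment_confidence(observations_30d=8, days_since_last=1)    # ~0.95
stale = segment_confidence(observations_30d=2, days_since_last=21)   # ~0.14

# Cookies scoring below some threshold could be flagged as anomalies
# or dropped, keeping the audited anomaly rate low.
```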

As this targeting methodology continues to mature, agencies tend to recommend prudent investment in testing, analysis, and infrastructure. Further, it is important to align the performance indicators of a campaign with the tactics that make up the overall strategy. Many are beginning to find that this data is most appropriately applied against mid-funnel performance indicators. For example, its best use may be to build awareness and preference among a target audience: driving users in a certain category or segment to an advertiser’s site or offer may feed the funnel and grow a base of potential converters. After all, finding a logical way to fit all available and appropriate tactics into a media plan is the art within digital advertising, often an ongoing process relying on testing and optimization.

Follow VivaKi (@VivaKi) and AdExchanger.com (@adexchanger) on Twitter.

3 Comments

  1. Kbarbieri

    Good piece Aaron. Totally agree that performance-based pricing is the way to go for third-party data providers. However, we need more standardization around techniques and tools for measuring this.

  2. Granular measurement will be critical to the future of data-driven advertising. We believe marketers need to understand the data they are using on a granular level, and they need to be able to measure and optimize the data independently. The industry needs more transparency to accomplish this. Many ad nets tell outright lies about the sources of the data they are accessing, and every data provider groups their data into a bucket, cloaking the actual source of the data. Why do they do this? Probably because a lot of the data in the bucket is crap.

  3. Chris Pirrone

    Very well written Aaron. Given that we can accurately measure and review granular performance metrics, it is relatively easy to determine the value of specific data segments. Performance alone determines whether we will retry data segments and what we can afford to pay. The trick these days is measuring performance when multiple data segments are used.
