Ad Tech Desperately Needs Data Exchange Standards

“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.

Today’s column is written by Sanjay Agarwal, vice president of engineering at Drawbridge.

In the ad tech ecosystem before OpenRTB, proprietary protocols caused long integration cycles, greater code complexity and maintenance overhead, and custom logic for parsing requests.

The OpenRTB protocol replaced all proprietary protocols for sending ad requests from supply-side platforms (SSPs) to demand-side platforms (DSPs), resulting in faster integrations and a scaled ecosystem as a whole.
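
For context, the snippet below sketches an abridged, OpenRTB-style bid request as a Python dictionary. The values are illustrative and most of the spec's optional objects are omitted; the point is that the field names and structure come from a shared spec rather than from each partner.

```python
import json

# Abridged, illustrative OpenRTB-style bid request (values are made up).
# Because the structure is defined by a shared spec, a DSP can parse
# requests from any SSP with a single code path.
bid_request = {
    "id": "1234567890",                  # unique auction ID from the SSP
    "imp": [{                            # impressions up for auction
        "id": "1",
        "banner": {"w": 300, "h": 250},
        "bidfloor": 0.50,
    }],
    "site": {"domain": "example.com"},
    "device": {"ua": "Mozilla/5.0 ...", "ip": "192.0.2.1"},
    "user": {"id": "cookie-abc123"},
    "tmax": 120,                         # milliseconds the SSP waits for bids
}

print(json.dumps(bid_request, indent=2))
```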

While this solved the SSP-DSP interaction almost a decade ago, there is still no standard format for exchanging data between data management platforms (DMPs) and DSPs, and that gap remains a major point of friction. There is a strong need to standardize both the data format and the transport layer used to push data from a DMP to a DSP.

The problem has been around for a decade or so, but the proliferation of data providers has made standardization all the more pressing. DMPs and DSPs certainly stand to benefit, but the real winners would be marketers. They would be able to leverage first- and third-party data more seamlessly, get campaigns set up more quickly, receive data back more easily and never have to worry about integration timelines.

Today, the typical timeframe for adding a new integration from a DMP to a DSP is anywhere from two to four months. The “standard” options for the transport layer are the file transfer protocol, an S3 bucket or HTTP, and the data format itself is often proprietary. Product and engineering teams from both sides get pulled into the integration, with calls, email threads and a lot of back-and-forth that could easily be avoided with a standardized data format.

The friction of sending and receiving data has resulted in niche DMPs using bigger DMPs, such as BlueKai, to channel their data to media platforms. But this third-party onboarding usually results in a loss of functionality and scale, with a drop-off at every data hop. Some companies also act as intermediaries that funnel data from data producers to data consumers, which again results in a loss of scale because of cookie translation. Data debugging becomes harder when third parties are involved, but in today's landscape it is unfortunately required.

Offline transfer of data, such as through S3 or the file transfer protocol, results in less robust data validation. There is no feedback given to the sender on whether the data is received and processed correctly. Plus, if the data is time sensitive, such as first-party data used for retargeting, offline transfer will usually cause further delays, rendering the data useless.
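
By contrast, a real-time push over HTTP can return immediate feedback. The sketch below, in Python, shows the idea; the endpoint, payload shape and response format are assumptions for illustration, not an existing standard.

```python
import json
import urllib.request

# Hypothetical real-time push from a DMP to a DSP. Unlike an S3 or FTP drop,
# the sender gets an immediate, per-record acknowledgment and can retry or
# alert on failures while the data is still fresh enough to act on.
records = [
    {"device_id": "abc-123", "add_segments": ["retargeting_cart_abandoners"]},
    {"device_id": "def-456", "add_segments": ["travel_q4"]},
]

payload = json.dumps({"records": records}).encode("utf-8")
req = urllib.request.Request(
    "https://dsp.example.com/v1/audience",   # assumed endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(req) as resp:
    # Assumed response shape: {"results": [{"device_id": ..., "status": ...}]}
    ack = json.loads(resp.read())
    failed = [r for r in ack["results"] if r["status"] != "ok"]
    if failed:
        print(f"{len(failed)} records failed and should be retried")
```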

Data formats are fragmented. Typically, audience data is organized by storing segment IDs keyed to cookies, device IDs or email hashes. Whatever data format is chosen, it should support both adding segment IDs to and removing them from those cookies or device IDs.
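
To make that concrete, here is a minimal sketch of what a standardized audience-update record could look like, written as a Python dictionary. The field names are hypothetical, not a proposed spec; the point is a single record keyed by one ID type with explicit add and remove lists.

```python
import json

# Hypothetical standardized audience-update record (field names assumed).
# Each record is keyed by a single ID type (cookie, device ID or email hash)
# and explicitly supports both adding and removing segment IDs.
update = {
    "id_type": "device_id",   # "cookie" | "device_id" | "email_hash"
    "id_value": "38400000-8cf0-11bd-b23e-10b96e40000d",
    "add_segments": ["in_market_auto", "sports_fans"],
    "remove_segments": ["expired_promo"],
}

print(json.dumps(update, indent=2))
```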

Another ideal feature would be bulk updates, where multiple user IDs can be modified, and each modification could carry an expiration date until which the segment is valid. The format could also support consumer choice and respect privacy through a standard opt-out that applies across all IDs associated with the consumer. I could go on.
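
Extending the same hypothetical format, a bulk update carrying expiration dates and a consumer opt-out could be sketched as follows; again, every field name here is an assumption for illustration only.

```python
import json

# Hypothetical bulk update: many IDs modified in one payload, each segment
# carrying an optional expiration date, plus an opt-out that the receiving
# platform would apply across every ID associated with the consumer.
bulk_update = {
    "updates": [
        {
            "id_type": "cookie",
            "id_value": "c-9f8e7d",
            "add_segments": [
                {"segment_id": "holiday_shoppers", "expires": "2025-12-31"},
            ],
        },
        {
            "id_type": "email_hash",
            "id_value": "5d41402abc4b2a76b9719d911017c592",
            "opt_out": True,   # purge segments across all associated IDs
        },
    ],
}

print(json.dumps(bulk_update, indent=2))
```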

Having a standard format for sending data would drastically cut down the time needed to integrate a DMP with a DSP, resulting in a more seamless exchange of data. Plus, new integrations could be turned on with the flip of a switch, and a real-time transport layer could support time-sensitive data.

Just as OpenRTB produced seamless integrations between SSPs and DSPs, the time is ripe for a standard data format and transfer process for exchanging audience data between DMPs and DSPs. It would mean more platforms participating in the ecosystem, faster integrations and, ultimately, greater efficiencies for marketers.

The question is, who will lead the charge?

Follow Drawbridge (@Drawbridge) and AdExchanger (@adexchanger) on Twitter.
