“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.
Today’s column is written by Ruben Schreurs, CEO at Digital Decisions.
In the early 1950s, Malcolm Purcell McLean stood in a massive line of trucks waiting to be offloaded by dock workers. The tedious process involved buckets of sweat and universal backache for the many men who formed the highly unionized dock crews.
He had a light bulb moment: What if, instead of the inefficient and time-consuming process of offloading the contents of his truck into storage facilities, where it would sit until it was loaded onto a ship or another truck, possibly multiple times, a pulley or crane system could just lift his entire trailer off the truck’s chassis? The trailer, later called a container, could be sealed and delivered in its entirety, eliminating many manual off- and on-loading steps, improving shipment speed and reducing cargo theft.
In the programmatic ecosystem, data, just like old-school cargo such as sand, grain or steel, is a commodity. But for such a modern commodity, we are arguably in worse shape than the cumbersome pre-container logistics era.
Data suffers from a lack of standardization. We have broadly accepted working with a patchwork of APIs and other middleware that tries to connect data from system A to system B. In their recent study, ISBA and PwC rightly called out the need for standardization of platform data to create a more efficient and transparent programmatic supply chain. The study could match only 11% of all data, and even within that matched portion it found a 15% unknown delta: a clear business case for standardization.
Standardization was the key that ensured containers became a cornerstone of freight transport once Malcolm successfully proved the concept. Many competitors had seen the potential of the container model, but designing a ship capable of fitting all the different container sizes was practically impossible, to say nothing of the truck trailer chassis around the world that would have to fit the containers for land transport.
There was a massive competitive scramble. Global standardization bodies fell over each other to claim the universal standards, and commercial firms pushed for their existing formats to become the global norm so that they would have a major head start on the competition, spared from writing off their redundant containers and investing in new inventory.
The standardization process was completed, and a number of container and fitting sizes were approved as the global norm. And this changed everything.
Today, I hope someone reads this, recognizes themselves in Malcolm's story and creates the ad tech container that will leapfrog our industry into the next generation of efficient data management and sharing across platforms and partners.
The main components of a standardization effort in our industry would include a clear and uniform data taxonomy that specifies standard dimensions and metrics so they can be easily matched across platforms. There should be clear guidelines for the access and export capabilities that platforms must offer. Wrapping it all together would be the ability to export data in a structured way, using an easy-to-manage and highly portable data container.
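To make the idea of a uniform taxonomy concrete, here is a minimal sketch of what a shared cross-platform reporting schema could look like. Every field name below is a hypothetical illustration for this column, not an existing industry standard.

```python
# Hypothetical sketch of a uniform cross-platform reporting taxonomy.
# The dimensions and metrics below are illustrative, not a real standard.
from dataclasses import dataclass, asdict

@dataclass
class AdReportRow:
    # Standard dimensions: identifiers every platform would report identically
    date: str              # ISO 8601 date, e.g. "2021-03-01"
    advertiser_id: str
    campaign_id: str
    platform: str          # e.g. "dsp_a", "ssp_b"
    # Standard metrics: uniform names and units across platforms
    impressions: int
    clicks: int
    media_cost_usd: float
    fees_usd: float

row = AdReportRow("2021-03-01", "adv-1", "cmp-9", "dsp_a",
                  120000, 340, 450.0, 67.5)
print(asdict(row))
```

Because every platform would emit the same field names and units, joining a DSP's report to an SSP's report becomes a simple key match instead of a bespoke reconciliation project.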
Considering there are several leading analytics platforms, a common adapter standard for plugging in an account seat without setting up a network of API connections would be a major win on top of this. But an initial leap forward could be simpler: a standardized CSV template that can be delivered automatically on a schedule.
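The CSV-template idea can be sketched in a few lines. The column names here are assumptions chosen for illustration; the actual template would be whatever the industry agrees on.

```python
# Sketch: exporting platform data to a standardized CSV template.
# The column names are hypothetical; no such agreed template exists today.
import csv
import io

COLUMNS = ["date", "advertiser_id", "campaign_id", "platform",
           "impressions", "clicks", "media_cost_usd", "fees_usd"]

rows = [
    {"date": "2021-03-01", "advertiser_id": "adv-1", "campaign_id": "cmp-9",
     "platform": "dsp_a", "impressions": 120000, "clicks": 340,
     "media_cost_usd": 450.0, "fees_usd": 67.5},
]

# Write to an in-memory buffer; a scheduled job would write to a file
# or object store instead.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=COLUMNS)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

Any platform emitting this exact header, on a schedule, would let an advertiser concatenate files from every partner into one comparable dataset with no middleware at all.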
The RTB protocol has helped create some level of data matchability across platforms, but platforms still vary widely in the granularity of the data they make available in their reporting interfaces, and in what format. As soon as we have a clear industry standard, advertisers and publishers will be able to properly manage and assess their activity across different partners, all the way through the ecosystem.
It will take some lobbying and applying pressure in the right places, but from a technological point of view, a standardized “data container” would be straightforward to design and implement for the platforms that drive our industry ecosystem.
If you are interested in reading more about Malcolm McLean, I highly recommend “The Box: How the Shipping Container Made the World Smaller and the World Economy Bigger” by Marc Levinson. It’s an amazing story, with lessons on innovation that apply across industries.