“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.
Today’s column is written by Aram Chekijian, senior vice president of analytics and insights at Accordant Media.
The recent purchases of Adometry and Convertro by Google and AOL, respectively, signify an industry that is finally taking big steps away from last-interaction and rule-based attribution models.
But there’s another exciting story behind this news, and it involves event-level data. The big media companies are squarely focused on the value of being able to access, manage and evaluate event-level data, captured and logged on a rolling basis – an analytical tool kit never before possible in traditional media.
Top Down Vs. Bottom Up
The majority of media measurement in the past – even full-portfolio, econometric marketing mix modeling (MMM) – has been a mostly theoretical practice. “Actual” media impression delivery is estimated and statistically fit to sales patterns, controlled for other factors when possible, and then adjusted up to a total population level to simulate the real world. These models are top down. Because of the nature of the buys and the lagging “actualization” of the data, it usually takes weeks – even months – to interpret the results and act on them. The models are macro by design, and their applications are big picture and thus relatively academic in practice.
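For readers who want the mechanics, here is a minimal sketch of that top-down fit, written in Python. The weekly figures, channel names and control variable are entirely illustrative assumptions, and the model is a plain least-squares regression of sales on estimated delivery, not any particular vendor’s MMM.

```python
# Hypothetical top-down MMM sketch: regress weekly sales on *estimated* media
# delivery plus a control variable. All numbers below are invented for
# illustration; nothing here is real campaign data.
import numpy as np

tv_grps    = np.array([120, 90, 150, 80, 110, 130, 95, 140])    # estimated TV GRPs
print_imps = np.array([4.0, 3.5, 5.0, 2.5, 4.5, 5.5, 3.0, 6.0]) # est. print impressions (MM)
season     = np.array([1.0, 0.9, 1.1, 0.8, 1.0, 1.2, 0.9, 1.3]) # seasonality control
sales      = np.array([210, 180, 250, 160, 205, 240, 185, 260]) # observed sales (000s)

# Ordinary least squares: sales ~ intercept + TV + print + seasonality.
X = np.column_stack([np.ones(len(sales)), tv_grps, print_imps, season])
coefs, *_ = np.linalg.lstsq(X, sales.astype(float), rcond=None)
print(dict(zip(["intercept", "tv_grps", "print_imps", "season"], coefs.round(2))))
```

Everything here operates on aggregates and estimates, which is exactly why the results arrive slowly and read as big picture.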
On the other hand, the current ubiquity of event-level data, which captures media exposure at the near-subatomic level, enables better analyses and faster actionability. The benefits of this bottom-up approach built on granular, observed data are numerous: Delivery estimates are no longer necessary; models become more stable; statistical significance is easier to attain; and again, results are actionable in a tight, closed-loop implementation.
Now, the bottom-up paradigm is taking hold holistically across media as new digital channels emerge and programmatic exchanges become more widespread, standardized and scaled. Moreover, as investment dollars continue flowing into digital in general, and programmatic in particular, the bottom-up model will begin to capture the maximum possible information about the user as the “sample” grows toward 100%.
From Accurate Accreditation To Budget Simulation
A bottom-up analysis may be an inevitable replacement for top-down macro models. Hopefully the fallacy of last-interaction attribution is crystal clear. Would you attribute every unit sold at Walmart to the person greeting customers at the door? Why this method took hold as a standard is a question for the ages, but it is a poor proxy for real-world analytics.
The “rule-based” successor to last-interaction is an improvement only in the sense that it acknowledges that something else may have occurred, prior to that last click, which led to conversion. Unfortunately, the rules that determine the weights (first/even/last/etc.) are arbitrarily assigned and amount to self-fulfilling prophecies.
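To make that arbitrariness concrete, here is a short, hypothetical Python sketch of first-, even- and last-touch crediting. The channel names and exposure path are invented; the point is simply that the fractional credit comes from the chosen rule, not from anything observed in the data.

```python
# Rule-based attribution sketch: the credit is dictated by the rule itself.
def rule_based_credit(path, rule="last"):
    """Assign fractional conversion credit to each touchpoint by a fixed rule."""
    n = len(path)
    if rule == "last":
        weights = [0.0] * (n - 1) + [1.0]
    elif rule == "first":
        weights = [1.0] + [0.0] * (n - 1)
    elif rule == "even":
        weights = [1.0 / n] * n
    else:
        raise ValueError(f"unknown rule: {rule}")
    return dict(zip(path, weights))

path = ["display", "paid_search", "email"]  # hypothetical exposure sequence
for rule in ("last", "first", "even"):
    print(rule, rule_based_credit(path, rule))
```

Swap the rule and the “answer” swaps with it, which is precisely the self-fulfilling-prophecy problem.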
Proper fractional attribution is the first major step toward accurately accrediting conversion. This is also a key first step toward a bottom-up analog of full-scale top-down MMM analysis.
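In contrast, a data-driven fractional approach derives credit from observed paths. The sketch below uses a simple “removal effect” heuristic – how much the conversion rate drops when paths containing a channel are excluded – with invented path counts. It is one illustrative technique among many, not a description of Adometry’s or Convertro’s models.

```python
# Hypothetical fractional attribution sketch using a removal-effect heuristic.
# Each row is (exposure path, conversions, non-conversions); all counts invented.
paths = [
    (("display", "search"), 30, 170),
    (("search",),           20, 180),
    (("display",),          10, 190),
    (("display", "email", "search"), 15, 85),
]

def conversion_rate(rows):
    conv  = sum(c for _, c, _ in rows)
    total = sum(c + n for _, c, n in rows)
    return conv / total if total else 0.0

base = conversion_rate(paths)
channels = {ch for p, _, _ in paths for ch in p}

# Removal effect: drop in conversion rate when a channel's paths are excluded.
removal = {ch: base - conversion_rate([r for r in paths if ch not in r[0]])
           for ch in channels}
total_effect = sum(removal.values())
credit = {ch: effect / total_effect for ch, effect in removal.items()}
print({ch: round(w, 3) for ch, w in credit.items()})
```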
But two things hold back its full potential as the end-all solution. First, it is generally confined to media that is already trackable. Second, while it accurately values assists, it stops short of being a budgetary simulation tool.
The first point is changing fast as more media becomes trackable. New channels, such as addressable TV, though not necessarily bid in an RTB environment, can report impressions back with actualized time stamps and geocoordinates – potentially even IP addresses – as opposed to traditional estimates. This facilitates complete fractional attribution of user exposure and enables a full-pathway understanding of media’s contribution to conversion.
As for the second point, event-level data’s predictive stability will continue to be refined, and in the future, budget simulation or “what if” scenario planning should be possible. Implementing this practice in near real time allows for on-the-fly optimization, as well as budget forecasting exercises that can be tested and verified far more responsively.
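As a rough picture of what that could look like, the sketch below assumes saturating per-channel response curves – the curve shapes, parameters, channel names and budgets are all hypothetical – and compares predicted conversions for two allocations of the same total spend.

```python
# Hypothetical "what if" budget simulation: compare predicted conversions for
# two allocations under assumed diminishing-returns response curves.
import math

# Assumed response curves: conversions = max_conv * (1 - exp(-spend / scale)).
response = {
    "display":        lambda s: 500 * (1 - math.exp(-s / 40_000)),
    "paid_search":    lambda s: 800 * (1 - math.exp(-s / 60_000)),
    "addressable_tv": lambda s: 600 * (1 - math.exp(-s / 90_000)),
}

def predicted_conversions(allocation):
    return sum(response[ch](spend) for ch, spend in allocation.items())

current  = {"display": 50_000, "paid_search": 70_000, "addressable_tv": 30_000}
scenario = {"display": 30_000, "paid_search": 80_000, "addressable_tv": 40_000}

print("current :", round(predicted_conversions(current)))
print("scenario:", round(predicted_conversions(scenario)))
```

The same machinery, fed by continuously refreshed event-level data rather than assumed curves, is what would make these forecasts testable in near real time.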
This new paradigm of fractional attribution and econometric modeling is no longer a science experiment. In the historical absence and subsequent underutilization of this granular data, estimates, assumptions and rules were the best available practice. These inferential analyses are no longer necessary to understand media efficacy; marketers now can invest, measure, simulate and optimize in closer to real time than ever previously possible.
Follow Accordant Media (@Accordant) and AdExchanger (@adexchanger) on Twitter.