
Event-Level Data Enters The Spotlight


“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.

Today’s column is written by Aram Chekijian, senior vice president of analytics and insights at Accordant Media.

The recent purchases of Adometry and Convertro by Google and AOL, respectively, signify an industry that is finally taking big steps toward dismissing last-interaction and rule-based attribution models.

But there’s another exciting story behind this news that involves event-level data. The big media companies are squarely focused on the value of being able to access, manage and evaluate event-level data – captured and logged on a rolling basis – with an analytical tool kit never before possible in traditional media.

Top Down Vs. Bottom Up

The majority of media measurement in the past – even full-portfolio, econometric marketing mix modeling (MMM) – has been a mostly theoretical practice. “Actual” media impression delivery is estimated, statistically fit to sales patterns, controlled for other factors when possible and then adjusted up to a total population level to simulate the real world. These models are top down. Because of the nature of the buys and the lagging “actualization” of the data, it usually takes weeks – even months – to interpret the results and execute against them. The models are macro by design, and their applications are big picture and thus relatively academic in practice.
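To make the top-down mechanics concrete, here is a minimal sketch – every channel, number and coefficient is invented for illustration, and real MMM work is far richer – of the kind of aggregate regression at the heart of a simplified model: estimated weekly delivery fit to sales after the fact.

```python
# A minimal top-down sketch (all data hypothetical): weekly sales are
# regressed on *estimated* media delivery, the way a simplified
# marketing mix model fits aggregate patterns after the fact.
import numpy as np

weeks = 52
rng = np.random.default_rng(0)

# Estimated GRP-style delivery per channel, per week (not observed events).
tv_grps = rng.uniform(50, 200, weeks)
display_imps = rng.uniform(1e6, 5e6, weeks)

# Simulated weekly sales with noise, standing in for the brand's actuals.
sales = 10_000 + 40 * tv_grps + 0.002 * display_imps + rng.normal(0, 2_000, weeks)

# Ordinary least squares: sales ~ intercept + tv + display.
X = np.column_stack([np.ones(weeks), tv_grps, display_imps])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(f"intercept={coef[0]:,.0f}, per-GRP lift={coef[1]:.1f}, "
      f"per-impression lift={coef[2]:.4f}")
```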

On the other hand, the current ubiquity of event-level data, which captures media exposure at the near-subatomic level, enables better analyses and faster actionability. The benefits of this bottom-up approach built on granular, observed data are numerous: Delivery estimates are no longer necessary; models become more stable; statistical significance is easier to attain; and again, results are actionable in a tight, closed-loop implementation.

Now, the bottom-up paradigm is taking hold holistically across mediums as new digital channels emerge and programmatic exchanges become more widespread, standardized and scaled. Moreover, as investment dollars continue flowing into digital in general, and programmatic in particular, the bottom-up model will begin to capture the maximum possible information about the user, as the “sample” grows toward 100%.

From Accurate Accreditation To Budget Simulation

Bottom-up analysis may be the inevitable replacement for top-down macro models. By now, the fallacy of last-interaction attribution should be crystal clear: Would you attribute every unit sold at Walmart to the person greeting customers at the door? Why this method took hold as a standard is a question for the ages, but it is a poor proxy for real-world analytics.

The “rule-based” successor to last-interaction is an improvement only in the sense that it acknowledges that something else may have occurred, prior to that last click, which led to conversion. Unfortunately, the rules that determine the weights (first/even/last/etc.) are arbitrarily assigned and amount to self-fulfilling prophecies.
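A minimal sketch of how such rules allocate credit (the exposure path and conversion value here are hypothetical) makes the arbitrariness plain: the weights are chosen by fiat, not learned from the data.

```python
# A sketch of rule-based attribution: fixed weights, assigned by fiat.
def rule_based_credit(path, value, rule="even"):
    """Split conversion credit across an exposure path by a fixed rule."""
    n = len(path)
    if rule == "last":
        weights = [0.0] * (n - 1) + [1.0]
    elif rule == "first":
        weights = [1.0] + [0.0] * (n - 1)
    elif rule == "even":
        weights = [1.0 / n] * n
    else:
        raise ValueError(f"unknown rule: {rule}")
    # Credit per touchpoint; duplicate channels would need aggregation.
    return dict(zip(path, (w * value for w in weights)))

path = ["display", "video", "search"]          # hypothetical exposure path
print(rule_based_credit(path, 100.0, "last"))  # all $100 to the last touch
print(rule_based_credit(path, 100.0, "even"))  # $33.33 to every touch
```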


Proper fractional attribution is the first major step toward accurately accrediting conversion. This is also a key first step toward a bottom-up analog of full-scale top-down MMM analysis.
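As one illustration of how fractional credit can be learned rather than assigned, here is a simplified “removal effect” sketch on hypothetical paths. Production systems use richer approaches – logistic regression, Markov chains, Shapley values – but the principle is the same: a channel’s credit reflects how much conversion falls when it is taken away.

```python
# Hypothetical converting (1) / non-converting (0) exposure paths.
paths = [
    ({"display", "search"}, 1),
    ({"display"}, 1),
    ({"search"}, 1),
    ({"video"}, 0),
    ({"display", "video"}, 0),
    ({"search", "video"}, 1),
    ({"display", "search", "video"}, 1),
    ({"search"}, 0),
]
channels = {"display", "video", "search"}
base_rate = sum(conv for _, conv in paths) / len(paths)

# Removal effect: how far the conversion rate falls when every path
# touching a channel is excluded from the data set.
removal_effect = {}
for ch in channels:
    rest = [(p, c) for p, c in paths if ch not in p]
    rate = sum(c for _, c in rest) / len(rest) if rest else 0.0
    removal_effect[ch] = max(base_rate - rate, 0.0)

# Normalize the effects into fractional credit that sums to 1.
total = sum(removal_effect.values())
credit = {ch: round(eff / total, 3) for ch, eff in removal_effect.items()}
print(credit)
```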

But two things hold back its full potential as the be-all, end-all solution. First, it is generally confined to existing trackable media. Second, while it accurately values assists, it stops short of being a budgetary simulation tool.

The first point is changing fast as more media becomes trackable. New channels, such as addressable TV, though not necessarily bid in an RTB environment, can report impressions back with actualized time stamps and geocoordinates – potentially even IP addresses – as opposed to traditional estimates. This facilitates complete fractional attribution of user exposure and enables full-pathway understanding of media’s contribution to conversion.
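For a sense of what such a feed could look like, here is a hypothetical event record – the field names are illustrative, not any standard’s schema – and how actualized time stamps let events be stitched into the per-user pathway that fractional attribution operates on.

```python
# A hypothetical event-level impression record; fields are illustrative.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ImpressionEvent:
    user_id: str         # device or household identifier
    channel: str         # e.g., "addressable_tv", "display"
    timestamp: datetime  # actualized time stamp, not an estimate
    lat: float           # reported geocoordinates
    lon: float
    ip: Optional[str] = None  # potentially reportable, per the column

events = [
    ImpressionEvent("hh-123", "addressable_tv",
                    datetime(2014, 6, 1, 20, 15, tzinfo=timezone.utc),
                    40.71, -74.01),
    ImpressionEvent("hh-123", "display",
                    datetime(2014, 6, 2, 9, 3, tzinfo=timezone.utc),
                    40.71, -74.01, ip="203.0.113.7"),
]

# Sorting a user's events by actualized time stamp reconstructs the
# full exposure pathway from first touch to last.
pathway = sorted((e for e in events if e.user_id == "hh-123"),
                 key=lambda e: e.timestamp)
print([e.channel for e in pathway])  # ['addressable_tv', 'display']
```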

As for the second point, event-level data’s predictive stability will be refined, and in the future, budget simulation or “what if” scenario planning should be possible. The implementation of this practice in near real time provides for on-the-fly optimization, as well as budget forecasting exercises that can be tested and verified more responsively.
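A minimal sketch of such a scenario-planning exercise follows, with every parameter hypothetical: response curves fitted from event-level results are used to project conversions under alternative budget splits before a dollar is committed.

```python
# A "what if" budget sketch (all numbers hypothetical).
import math

# Diminishing-returns response curves per channel:
# conversions = a * (1 - e^(-b * spend)).
# In practice, (a, b) would be fit to observed event-level results.
curves = {
    "display": (5_000, 0.00002),
    "video":   (3_000, 0.00003),
    "search":  (8_000, 0.00001),
}

def projected_conversions(budget_split):
    """Sum each channel's saturating response at its allocated spend."""
    return sum(a * (1 - math.exp(-b * budget_split.get(ch, 0.0)))
               for ch, (a, b) in curves.items())

# Compare two scenarios for the same $300,000 budget.
scenario_a = {"display": 100_000, "video": 100_000, "search": 100_000}
scenario_b = {"display": 150_000, "video": 50_000, "search": 100_000}
print(f"even split:    {projected_conversions(scenario_a):,.0f} conversions")
print(f"display-heavy: {projected_conversions(scenario_b):,.0f} conversions")
```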

This new paradigm of fractional attribution and econometric modeling is no longer a science experiment. In the historical absence – and subsequent underutilization – of this granular data, estimates, assumptions and rules were the best available practice. Those inferential analyses are no longer necessary to understand media efficacy; marketers can now invest, measure, simulate and optimize in closer to real time than ever previously possible.

Follow Accordant Media (@Accordant) and AdExchanger (@adexchanger) on Twitter.
