
Frontiers in Attribution Modeling: Questions for Initiative EVP Bret Leece


Bret Leece knows analytics. He got his start in 1995 creating econometric time series models to track and predict Sprint’s call center activity, before moving on to CRM and database marketing roles, including at MarketShare. Currently Initiative’s EVP of performance analytics, he has a long view of where advertising performance has been and where it’s headed.

In this discussion with AdExchanger, he describes Initiative’s evolving approach to multi-touch attribution, the “new bottleneck” in performance-driven media, and how only two attribution vendors – Adometry and Visual IQ – were able to meet all his agency’s needs in a recent review of the category.

AdExchanger: Attribution modeling appears to be getting more attention lately. Why is that, do you think?

BL: Primarily, it’s because you’ve got a lot of different touch points with digital. The last‑click method, everybody is realizing, is just not intuitively right. Everybody’s trying to claim some sort of credit for the conversion. The next generation of attribution needs to be focused around brand metrics, attitudes and perceptions. I think everybody’s trying to get back to the promise of digital with measurability and realizing that last‑click and view‑through metrics just don’t give the whole picture.

Would you describe Initiative’s approach and how it has evolved?

When I joined Initiative at the end of 2010, we were just building out our digital attribution approach; up to that point it was primarily marketing-mix models using time-series econometrics.

One thing on my plate from day one at Initiative was to get more out of multi‑touch attribution. We had done testing with log level data but we were just swimming in it rather than doing anything real. We realized that we couldn’t do it ourselves because it involved a big data challenge; we needed to find a partner.

Last summer, we went out to the marketplace. We looked at ClearSaleing, x+1, Adometry and Visual IQ, went deep into their capabilities. Our requirements were pretty high. We said, “Look, we want empirical, objective, multi‑touch attribution as one deliverable, so show us a report that provides fractional credit and the resulting ROI across all the different touches, down to the creative level.”
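To make that first deliverable concrete, here is a minimal sketch of how model-assigned fractional credits might roll up to attributed value and ROI per publisher and creative. The paths, credit weights and spend figures below are all hypothetical; real vendors derive the weights from a statistical model rather than hard-coding them.

```python
# A minimal sketch of a fractional-credit report.
# All path data, credit weights, and costs are hypothetical.
from collections import defaultdict

# Each conversion path is a list of (publisher, creative) touches, with a
# model-assigned fractional credit per touch that sums to 1.0 per path.
paths = [
    {"touches": [("pub_a", "cr_1"), ("pub_b", "cr_2")],
     "credits": [0.35, 0.65], "conversion_value": 40.0},
    {"touches": [("pub_b", "cr_2"), ("pub_a", "cr_3")],
     "credits": [0.20, 0.80], "conversion_value": 40.0},
]
spend = {("pub_a", "cr_1"): 10.0, ("pub_a", "cr_3"): 12.0, ("pub_b", "cr_2"): 18.0}

value = defaultdict(float)
for path in paths:
    for touch, credit in zip(path["touches"], path["credits"]):
        value[touch] += credit * path["conversion_value"]

# ROI per (publisher, creative) cell: attributed value relative to spend.
for touch, v in sorted(value.items()):
    print(touch, "attributed value:", round(v, 2), "ROI:", round(v / spend[touch], 2))
```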

Then, our second requirement was, “Take that attribution model and be able to run an optimization at a very, very low level. Give us a new plan that gives me more conversions for the same budget, down to the creative level, so that I can actually then take that plan and put it into my buying systems and generate new insertion orders.” The ultimate thing for the agency is to get a better plan and be able to act on it.
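And to illustrate the second requirement, a toy version of the constrained reallocation: keep the budget fixed and put each incremental dollar where the marginal conversions are highest. The diminishing-returns response curves here are invented for the example; production optimizers fit curves per placement and creative from the attribution output and honor many more constraints (minimum spends, pacing, deal commitments).

```python
# A toy "same budget, more conversions" optimizer.
# Response curves and cells are hypothetical.
import math

# Diminishing-returns response: conversions(spend) = a * log(1 + spend / b)
cells = {
    ("pub_a", "cr_1"): {"a": 30.0, "b": 50.0},
    ("pub_b", "cr_2"): {"a": 80.0, "b": 200.0},
    ("pub_c", "cr_3"): {"a": 15.0, "b": 20.0},
}

def conversions(cell, spend):
    p = cells[cell]
    return p["a"] * math.log(1.0 + spend / p["b"])

# Greedy allocation: hand out the budget in small steps to whichever cell
# currently offers the highest marginal conversions per dollar.
budget, step = 1000.0, 10.0
plan = {cell: 0.0 for cell in cells}
while budget > 0:
    best = max(cells, key=lambda c: conversions(c, plan[c] + step) - conversions(c, plan[c]))
    plan[best] += step
    budget -= step

for cell, spend in plan.items():
    print(cell, "spend:", spend, "expected conversions:", round(conversions(cell, spend), 1))
```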

What we saw was there were really only two vendors, Visual IQ and Adometry, who could deliver that. We ran the tests with Adometry. It took a while because we wanted a good-sized sample. We started ingesting data in September and then the last data period was in January, but they delivered. They gave us attribution and the new, better plan with constraints at a very granular level.

What’s happened since then is that the bottleneck has shifted away from analytics, which traditionally took too long, with results coming in too late to make a real change and impact in the execution.


How long have you worked in analytics?

I started my career in ’95 at Sprint, building models to forecast when people were calling customer service – econometric time series models. Then I went into CRM and database marketing for a while, and ticketing in entertainment. Yeah, I’ve been in analytics for a long time. Too long!

So you have a very long view of these types of tools and what’s possible with the amount of data that’s accessible. What do you think are the key improvements in attribution modeling? What improvements still need to be made?

What we’re finding interesting is the rise of new methods, whether that’s Bayesian probability models or agent‑based models. We’re starting to get beyond the limits of time series econometrics, which has a big problem with what’s called multicollinearity. There are statistical methods now being applied [to] very, very granular data, user-level data. It’s almost a return, in a weird way, to CRM and database marketing, using those types of logistic regression choice models with user-level attributes.
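As an illustration of the user-level choice-model idea he describes, here is a minimal sketch: a logistic regression of convert/no-convert on per-user exposure counts. The channels, coefficients and data are synthetic; real inputs would be log-level impressions rolled up to one row per user or cookie, with many more attributes.

```python
# A minimal user-level "choice model" sketch on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users = 5000

# Features: per-user exposure counts by channel (display, search, video).
X = rng.poisson(lam=[2.0, 1.0, 0.5], size=(n_users, 3))

# Synthetic ground truth: search drives conversion most, video least.
true_beta = np.array([0.15, 0.60, 0.05])
p = 1.0 / (1.0 + np.exp(-(X @ true_beta - 2.5)))
y = rng.binomial(1, p)

model = LogisticRegression().fit(X, y)
for channel, coef in zip(["display", "search", "video"], model.coef_[0]):
    print(channel, "estimated lift coefficient:", round(coef, 3))
```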

It’s partly a computer science problem, because of the big data and the way you have to structure it. Then it’s partly a statistics problem. You’ve got people starting to solve these two things simultaneously. Adometry has the statisticians, but they also have computer scientists, and they’re working together. That’s what’s coming up as the future. I do think very highly of Adometry, but I think Visual IQ is there as well.

Quite honestly, there are always ways you can ingest more data. One of the development tasks for Adometry is to start taking some of the DMP data and looking at multi‑touch attribution in terms of the customer segments. Do males respond more than females? Does this cohort of customers respond differently or have a different path? Does brand advertising work [better] on one particular segment?
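A sketch of the segment cut he describes, assuming DMP attributes have already been joined to the attribution output; the cookie IDs, segment labels and numbers below are made up.

```python
# Compare attributed conversions across hypothetical DMP segments.
from collections import defaultdict

attributed = [  # (cookie, model-attributed conversions) - hypothetical
    ("cookie_a", 0.8), ("cookie_b", 0.3), ("cookie_c", 0.6), ("cookie_d", 0.1),
]
dmp_segment = {"cookie_a": "male", "cookie_b": "female",
               "cookie_c": "male", "cookie_d": "female"}

by_segment = defaultdict(list)
for cookie, conv in attributed:
    by_segment[dmp_segment[cookie]].append(conv)

for segment, vals in by_segment.items():
    print(segment, "avg attributed conversions:", round(sum(vals) / len(vals), 2))
```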

I do think that the bottleneck is going to shift to the buying systems and, frankly, to the supply side. There’s a joke at the upfronts that it’s easier to spend $1 billion on TV than $100,000 on digital.

What, in your view, might start to ease the new bottleneck that’s showing up in buying systems?

We’re looking into that. I think it’s a combination of systems on the agency side and workflow changes that will involve the client. Imagine you’ve got an attribution model running, not necessarily in real time, but maybe on a daily basis. Then you crank the optimization and you’ve got a new, better plan. It’s about an approval and workflow system to get those little changes through, like taking $500 off of this tactic, off of this publisher and creative, and putting it on that publisher and creative. OK, approve that, right? How do you scale the micro changes to capture the lift in your optimized forecast? Workflow technology.
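A rough sketch of that workflow idea: diff the current plan against the newly optimized one and emit small, approvable change orders. The field names and the materiality threshold are illustrative, not any particular buying system’s API.

```python
# Generate approvable change orders from a plan diff. All values hypothetical.
current_plan = {("pub_a", "cr_1"): 5000.0, ("pub_b", "cr_2"): 3000.0}
optimized_plan = {("pub_a", "cr_1"): 4500.0, ("pub_b", "cr_2"): 3500.0}

change_orders = []
for cell in current_plan:
    delta = optimized_plan.get(cell, 0.0) - current_plan[cell]
    if abs(delta) >= 100.0:  # ignore noise below a materiality threshold
        change_orders.append({
            "publisher": cell[0], "creative": cell[1],
            "action": "increase" if delta > 0 else "decrease",
            "amount": abs(delta), "status": "pending_approval",
        })

# Each order would route for approval before new insertion orders
# are generated in the buying system.
for order in change_orders:
    print(order)
```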

The agency’s got a bunch of work to do, and then on the supply side, on the publisher’s side, they’re probably not ready for the small changes either: “Move this. Take that banner off of that placement and put it on this placement.” I think it’s going to take some sort of data interface to do that. Plus, you have the cost side: the more dynamic the costs are, the more you actually have to feed them back into the whole cycle, because that does change your decisions.

In its Wave report, Forrester rated the pure‑play attribution vendors much higher than the big analytics companies, but it sees potential for the analytics providers to catch up. Do you agree?

I think it’s going to be close. The reality is that the agency’s competitive advantage is that it has the plan. Agencies are basically in the slipstream of the client; they’re having daily conversations with the client. They have the plan. More and more, there’s consolidation of all the different lines of business to one agency. As much as IBM or Adobe want to come in with some analytics software, they don’t have the plan. The objective of the analytics should be to produce a better plan that can be acted on, on the right cadence.

Take a long-consideration product like autos versus a fast-consideration product like CPG or fast food. You may optimize autos once a month, because you’ve got to give the advertising time to work. Whereas if you’re a fast-food brand and you’ve got a promotion that just came out with a new product, you know within a matter of days whether it’s going to work.

The point of the analytics is to get a better plan and realize the gains that could be had with a better plan. I don’t see these technology providers figuring that part out. A lot of them are focused on just the left‑hand side, which is attribution, and that can show you lots of pretty charts. But to do true optimization you’ve got to have non‑linearity, and you’ve got to have the costs, and the agencies have all of the costs.

It’s going to take a lot for it to play out, but I honestly see these technology companies probably buying an Adometry or a C3 and trying to get into it that way.

Whereas three years ago only one vendor reviewed by Forrester offered algorithmic models for attribution modeling, now fully half of them do. How important to your clients is using an algorithmic approach versus last click?

I don’t think the clients care about the method. They care about, “Where am I today and how do I get more for the same budget tomorrow?” You can optimize to last click. It’s not necessarily wrong, but I don’t think it takes in as much information as you can today. What happens with last-click optimization is you shift toward the lower funnel, and that’s tough for brand marketers, and most of our clients do some sort of brand marketing. You’re giving short shrift to the upper-funnel stuff that does change attitudes and perceptions, which is critical to how advertising works.

One of the reasons for multi‑touch attribution is to give some fair credit to the upper-funnel stuff that’s ultimately impacting the likelihood that somebody is going to convert downstream.
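A small worked example of the bias he describes: on the same conversion paths, last-click hands all credit to the final touch (often search), while even a naive even-split fractional rule surfaces the upper-funnel touches. The paths and the even-split rule are illustrative only; real multi-touch models assign credit statistically.

```python
# Last-click vs. a naive fractional rule on the same hypothetical paths.
from collections import Counter

paths = [
    ["display", "display", "search"],
    ["video", "search"],
    ["display", "search"],
]

last_click = Counter(path[-1] for path in paths)
fractional = Counter()
for path in paths:
    for touch in path:
        fractional[touch] += 1.0 / len(path)  # even split across the path

print("last-click credit:", dict(last_click))
print("fractional credit:", {k: round(v, 2) for k, v in fractional.items()})
```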

Can you talk about the integration of digital with offline touchpoints? How much are you doing and how important is it to your overall modeling approach?

I think it is important, but we can’t boil the whole ocean, right?

We have to start with the in‑channel analytics for digital, get that right, and then start feeding in more exposure data to figure out the rest. I think the first thing you do is try to account for the effectiveness of TV. You don’t necessarily try to optimize it with the algorithmic method; just start with attribution. I think the Holy Grail is taking atomic-level data and optimizing the whole consumer brand experience in media to business results. I’ve seen too many things fail and clients get disappointed, so I’m very much focused on getting digital right first.

I will say that time series econometrics is still the standard for cross channel analytics. If you want one method that measures the impact for each major media channel, econometrics is still the method.
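For readers unfamiliar with the technique he calls the standard, here is a minimal marketing-mix-style sketch: regress weekly sales on adstocked channel spend. The data is synthetic and the model deliberately simplified; real econometric models add seasonality, price, promotions and distribution, and must contend with the multicollinearity he mentioned earlier.

```python
# A minimal marketing-mix regression sketch on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
weeks = 104
spend = {ch: rng.uniform(0, 100, weeks) for ch in ["tv", "digital", "print"]}

def adstock(x, decay=0.5):
    # Geometric carryover: this week's effective pressure includes a
    # decayed share of past spend.
    out = np.zeros_like(x)
    carry = 0.0
    for t, v in enumerate(x):
        carry = v + decay * carry
        out[t] = carry
    return out

X = np.column_stack([adstock(spend[ch]) for ch in ["tv", "digital", "print"]])
true_beta = np.array([0.8, 1.2, 0.2])
sales = 500 + X @ true_beta + rng.normal(0, 20, weeks)

# Ordinary least squares with an intercept.
X1 = np.column_stack([np.ones(weeks), X])
beta = np.linalg.lstsq(X1, sales, rcond=None)[0]
print("baseline sales:", round(beta[0], 1))
for ch, b in zip(["tv", "digital", "print"], beta[1:]):
    print(ch, "estimated impact per adstocked dollar:", round(b, 2))
```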

Back to the Holy Grail, I think we’re probably five years off from algorithmically solving that. The data needs to get better before this can happen … because if you think about what we’re doing with Adometry at the cookie level, I don’t know if that cookie has been exposed to TV.

We can do a probability of TV exposure by joining household-level data to cookies … that is, with granular household-level Rentrak data we can join to household-level cookie data, but it’s still a probability. The net is that not all of the data is at the same level, and that makes the Holy Grail difficult to reach. Which is why it’s a Holy Grail.
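A minimal sketch of that probabilistic join, assuming a household-to-cookie match table already exists; all identifiers and exposure rates below are invented.

```python
# Attach household-level TV exposure probabilities to matched cookies.
tv_exposure_prob = {           # P(household saw the TV spot) - hypothetical
    "hh_001": 0.85,
    "hh_002": 0.10,
}
cookie_to_household = {        # cookie -> matched household - hypothetical
    "cookie_a": "hh_001",
    "cookie_b": "hh_001",
    "cookie_c": "hh_002",
}

# Each cookie inherits its household's exposure probability; it never
# becomes a deterministic yes/no, which is his point about the Holy Grail.
for cookie, hh in cookie_to_household.items():
    print(cookie, "P(TV exposed) =", tv_exposure_prob[hh])
```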

What’s your view of Facebook from a measurement and attribution standpoint?

If multi‑touch attribution continues to grow, they are going to be left in a silo, because they aren’t providing cookie-level exposure data and conversion data. You’re going to have to evaluate them differently. We think Facebook works, but with the standard analytics that you get out of it, you can’t really compare it on the same level to other platforms or publishers.

Their research people are great to work with and they definitely know that they need to figure out a way to be open within the constraints of their privacy policy.

Any more thoughts on Adometry, how you selected them and why?

The reason we like these guys is because they’re honest and they delivered. There are a lot of people out there with optics and smoke and mirrors. One example of Adometry’s honesty is when they said to us, “Look, we see a spike in the brand search conversion path that can’t be attributed to display … We think it’s TV, and we want to test TV exposure data.” They get their PhDs and M.S. people on the phone with us. No optics.

What’s the right balance for you between relying on them for a services layer to go along with their platform – versus managing everything in‑house?  

Yeah, we’re looking at that. I will say that if you look at MAP’s (Mediabrands Audience Platform) model, it’s a very open model. They’re working with the networks and other providers. It’s akin to saying, “We’re not going to try to do every technical layer of the value chain here, but we are going to have a model that goes after the audience and gets the performance for our clients, in terms of what we invest in building.” What we ultimately need is for it to deliver business results for our clients. If that involves a third party, then we’ll go figure out how to make that happen, because we’re in the business of delivering business results to our clients.

By Zach Rodgers
