When Waffle House locations start to close, it’s a sign of severe weather on the horizon. When there’s a surge in pizza orders near the Pentagon, a potential international crisis may be looming.
And when there’s an uptick in the size of Ads.txt files, it’s a pretty good indication that bid duplication is on the rise.
According to Jounce Media, the number of authorized Ads.txt entries has tripled since 2020. Web publishers initiate roughly 30 million ad auctions per second and, on average, media sellers work with 24 SSPs, more than half of which participate in resold auctions. (Who can even name 24 SSPs off the top of their head?)
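For a concrete sense of what a growing Ads.txt file looks like, here's a minimal Python sketch (not Jounce's methodology) that counts the authorized seller entries and the distinct exchange domains in a publisher's ads.txt file. The URL is a placeholder, not a real publisher:

```python
import urllib.request

def count_ads_txt_entries(url: str) -> tuple[int, int]:
    """Count authorized seller entries and distinct exchange domains in an
    ads.txt file (IAB format: domain, account ID, relationship[, cert ID])."""
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8", errors="replace")

    entries = 0
    exchanges = set()
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # strip comments and whitespace
        if not line or "=" in line.split(",", 1)[0]:
            continue  # skip blanks and variable lines like contact=...
        fields = [f.strip() for f in line.split(",")]
        if len(fields) >= 3 and fields[2].upper() in ("DIRECT", "RESELLER"):
            entries += 1
            exchanges.add(fields[0].lower())
    return entries, len(exchanges)

# Hypothetical publisher domain, for illustration only:
total, distinct = count_ads_txt_entries("https://example-publisher.com/ads.txt")
print(f"{total} authorized entries across {distinct} exchange domains")
```

Run against real publishers, counts like these are the "uptick" in question: more entries generally means more paths, and more resale.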
As chaotic as that sounds, it’s actually rational behavior on the part of publishers, said Chris Kane, CEO and founder of Jounce, speaking at AdExchanger’s Programmatic I/O event in NYC last week.
Publishers are trying to do whatever they can to maximize their share of demand.
But buyers never see most of these requests.
‘All sorts of chaos’
Just because publishers initiate more auctions doesn’t mean DSPs are listening to more bids.
In fact, Kane said, most DSPs are trying to listen to fewer bid requests as a way to save money, because the bandwidth costs add up.
“There is no DSP – not one – that listens to 30 million bid requests per second,” Kane said. The average DSP listens to only around 3 million queries per second (QPS), which means greater auction density doesn’t actually lead to better monetization.
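To make the cost tradeoff concrete, a crude way to picture a QPS cap is probabilistic sampling, as in the toy sketch below. (In practice, most of the filtering happens upstream at the SSP, as discussed later, so this is purely illustrative.)

```python
import random

def make_qps_throttle(cap_qps: float, inbound_qps: float):
    """Return a filter that probabilistically drops bid requests so the
    expected listened-to rate stays at cap_qps. Illustrative only."""
    keep_probability = min(1.0, cap_qps / inbound_qps)
    def should_listen(_bid_request) -> bool:
        return random.random() < keep_probability
    return should_listen

# A DSP facing 30M inbound QPS, willing to pay bandwidth for only 3M:
should_listen = make_qps_throttle(cap_qps=3_000_000, inbound_qps=30_000_000)
# On average, only 1 in 10 bid requests gets through the throttle.
```

The point of the sketch is the denominator: every request a DSP agrees to listen to costs real bandwidth, so the rational move is to shrink the cap, not raise it.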
Capping QPS is also an example of rational behavior on the part of buyers contending with rampant bid duplication. And rampant bid duplication is, in turn, a rational reaction to volume bias.
So, everyone’s acting rationally. Buyers are protecting themselves, and publishers are just trying to get by.
But because all this rational behavior is in response to an irrational and untenable dynamic, it “creates all sorts of chaos in the supply chain,” Kane said.
Bidstream congestion
Another problem is algorithmic.
DSP algorithms tend to have a volume bias, meaning they favor publishers with more impressions for sale on the assumption that these publishers are larger and therefore have more valuable inventory.
That’s one of the main reasons publishers run multiple duplicative auctions. It’s an attempt to boost their volume and capture more bids by appealing to the algorithm.
But DSPs aren’t increasing their QPS in response.
The result, Kane said, is “a crowding out effect,” whereby publishers are fiercely competing with their peers simply for the opportunity to be made available to DSPs.
This phenomenon, which Kane called “bidstream congestion,” is a problem because it incentivizes bid duplication and disadvantages publishers that try to cut down on waste.
Publishers that switch off rebroadcast auctions – as in, resold auctions with unnecessary hops – get squeezed out.
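A toy simulation makes the crowding-out effect easy to see. Assume two publishers with identical real inventory, one of which rebroadcasts every impression through five extra resold paths, and a QPS-capped DSP that samples a fixed slice of the bidstream at random. All figures are invented for illustration:

```python
import random

random.seed(7)  # reproducible toy numbers

# Two publishers with identical real inventory. Publisher B rebroadcasts
# every impression through five extra resold paths; Publisher A does not.
REAL_IMPRESSIONS = 100_000
bidstream = ["A"] * REAL_IMPRESSIONS + ["B"] * REAL_IMPRESSIONS * 6

# A QPS-capped DSP can only listen to a fixed slice of the bidstream.
DSP_CAPACITY = 20_000
sampled = random.sample(bidstream, DSP_CAPACITY)

# Despite equal real inventory, B gets ~86% of the DSP's attention, A ~14%.
print(f"A's share of the DSP's view: {sampled.count('A') / DSP_CAPACITY:.0%}")
print(f"B's share of the DSP's view: {sampled.count('B') / DSP_CAPACITY:.0%}")
```

Equal inventory, but the rebroadcaster captures roughly six times the DSP's attention, which is exactly the incentive problem Kane describes.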
You are what you eat
Meanwhile, DSPs end up with a distorted picture of the supply chain.
Unless a DSP listens to the entire bidstream – which isn’t feasible – it sees only the supply chains that match its existing buying patterns.
Because DSPs purposely limit their QPS, supply-side platforms filter the bidstream to choose which subset of their bid requests to send to each DSP. To guide that filtering, SSPs “try to figure out the characteristics of supply that each DSP seems to like,” Kane said.
In other words, SSPs “feed DSPs what they eat,” he said.
If a DSP mostly buys app inventory, SSPs will play into that propensity and mostly show it mobile app placements. The same goes for web supply. That’s logical. But applying the same “you are what you eat” logic to other types of supply amplifies the crowding-out effect.
For example, some DSPs don’t buy resold auctions. When SSPs figure that out, they stop sending those types of requests. The opposite is also true. DSPs that buy a lot of resold auctions end up seeing more and more resold inventory.
“If you’re a publisher that has turned off rebroadcasting,” Kane said, “you get totally crowded out of this DSP.”
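The filtering logic itself doesn't need to be exotic. A hypothetical sketch of SSP-side traffic shaping might look like the Python below, where the thresholds and per-supply-type bookkeeping are assumptions, not any SSP's actual implementation:

```python
from collections import Counter

class TrafficShaper:
    """Hypothetical sketch of SSP-side filtering: track which kinds of
    supply a given DSP actually bids on, and stop sending what it ignores."""

    MIN_OBSERVATIONS = 1_000  # explore a supply type before judging it
    MIN_BID_RATE = 0.01       # below this, the DSP "doesn't eat" this supply

    def __init__(self):
        self.sent = Counter()  # requests sent, keyed by supply type
        self.bids = Counter()  # bids received, keyed by supply type

    def record(self, supply_type: str, dsp_bid: bool) -> None:
        self.sent[supply_type] += 1
        if dsp_bid:
            self.bids[supply_type] += 1

    def should_send(self, supply_type: str) -> bool:
        if self.sent[supply_type] < self.MIN_OBSERVATIONS:
            return True  # not enough data yet; keep sending
        bid_rate = self.bids[supply_type] / self.sent[supply_type]
        return bid_rate >= self.MIN_BID_RATE
```

The feedback loop is the dangerous part: once a DSP's capped view is dominated by resold supply, a direct path from a publisher that turned off rebroadcasting can fall below the bid-rate threshold and stop being sent at all.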
The result is that publishers have to fight tooth and nail just for the chance to be seen by DSPs, while buyers get a limited and skewed view of available inventory.
‘Skinny pipes’ vs. ‘fat pipes’
One way to fix this doom loop of duplicate programmatic bidding, according to Kane, is for DSPs to stop using so-called “skinny pipes” and move to a smaller handful of “fat pipes.”
If a typical mid-market DSP caps itself at 3 million QPS and has integrations with, say, 30 SSPs (which is a normal setup by today’s standards), that DSP likely allocates around 100,000 QPS to each of its SSP partners.
In that scenario, it’s as if the DSP is operating 30 constricted (aka skinny) pipes, each with a narrow view of the bidstream.
Instead, Kane said, it would be “much, much better” for the DSP to allocate way more capacity to a handful of SSP integrations, say 1 million QPS apiece across three “fat pipes” for a broader view of the bidstream. It can do this without increasing its total QPS.
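The back-of-envelope math, with the DSP-side numbers taken from Kane's example and the SSP-side figure invented purely for illustration:

```python
# DSP capacity and integration counts from Kane's example.
TOTAL_DSP_QPS = 3_000_000

qps_per_skinny_pipe = TOTAL_DSP_QPS // 30  # 100,000 QPS per SSP
qps_per_fat_pipe = TOTAL_DSP_QPS // 3      # 1,000,000 QPS per SSP

# Suppose a large SSP has ~1M QPS of deduplicated supply (hypothetical):
SSP_UNIQUE_QPS = 1_000_000
skinny_coverage = qps_per_skinny_pipe / SSP_UNIQUE_QPS
fat_coverage = min(1.0, qps_per_fat_pipe / SSP_UNIQUE_QPS)
print(f"skinny pipe sees {skinny_coverage:.0%} of the SSP's unique supply")  # 10%
print(f"fat pipe sees {fat_coverage:.0%} of the SSP's unique supply")        # 100%
```

Same total QPS, very different visibility: under the hypothetical numbers above, each skinny pipe sees about a tenth of a given SSP's deduplicated supply, while a fat pipe can take in all of it.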
If DSPs begin to operate this way – and in Kane’s view, “this is where market forces are going to pull the industry” – it would discourage wasteful forms of reselling.
“As DSPs rip out inefficient broadcasting supply chains, and as they listen to the entirety of the bidstream through efficient supply chains, the same amount of money will leave DSPs,” Kane said. “Less of that money will be taken by supply-chain fees and more of that money will be paid out to publishers.”
Which sounds good, but not for everyone. DSPs that don’t turn off rebroadcast auctions will slowly go out of business, Kane said.
This isn’t a trivial problem to solve, and there are a lot of “understaffed and poorly managed” smaller and mid-market DSPs that don’t fully appreciate how this affects their business, Kane said. They wind up listening to a boatload of duplicated bids that only represent a sliver of available supply.
And so it shouldn’t be surprising if more marketers shift their budgets to larger omnichannel DSPs like The Trade Desk that never listen to resold auctions.
“I think it’s already happening,” Kane said.