This year, ATS London saw a panel of industry experts discuss the changes in how publishers are monetizing content through evolving advertising practices. The panel consisted of Karen Eccles, Director of Digital Sales and Innovation at The Telegraph; Damon Reeve, CEO, The Ozone Project; Jason Trout, EMEA MD, Unruly; and Jourdain Casale, VP of Global Intelligence at Index Exchange.
Bid Caching: The Elephant in the Room
First up was a discussion of bid caching, the practice championed by Index Exchange and a contentious subject that drew strong reactions from the audience. Casale defended the innovation as a way to better understand audiences and engagement, using first-party data to drive performance. He conceded that it shouldn't have been taken to market without clear communication to buyers and sellers, and confirmed that it has been deactivated. This proved to be a recurring theme on the panel: a call for technology companies to work more closely and transparently with publishers.
Casale argued that bid caching was intended as a solution to the crowded header bidding ecosystem. Header bidding, he explained, solved the problem of auctions being decided by factors such as whether a bid reached a publisher's ad server at all, thereby improving the user experience. However, work was still needed to minimize page load times and maximize the percentage of bids that reach publishers' ad servers, and this was the motivation behind bid caching.
Casale concluded by saying that Index Exchange intends to bring the feature back, but only with clear, transparent communication about how it works and with functionality that gives buyers more control.
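The mechanism described above can be sketched in a few lines. The class, method names, and TTL below are purely illustrative assumptions, not Index Exchange's actual implementation: the idea is simply that a losing bid from one auction is held briefly and considered for a later impression, reducing round trips to demand partners.

```python
import time

class BidCache:
    """Hypothetical sketch of bid caching: reuse recent losing bids."""

    def __init__(self, ttl_seconds=30):
        self.ttl = ttl_seconds
        self._cache = []  # list of (expiry_time, bid) tuples

    def store_losing_bids(self, bids, winner):
        # After an auction, keep every bid except the winner for a short time.
        now = time.time()
        for bid in bids:
            if bid is not winner:
                self._cache.append((now + self.ttl, bid))

    def best_cached_bid(self, floor_price):
        # Drop expired entries, then return the highest live bid above the floor.
        now = time.time()
        self._cache = [(exp, b) for exp, b in self._cache if exp > now]
        live = [b for _, b in self._cache if b["cpm"] >= floor_price]
        return max(live, key=lambda b: b["cpm"]) if live else None
```

The controversy the panel discussed follows directly from this shape: a cached bid was priced against one impression but can end up served against a different one, which is why buyers demanded disclosure and control.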
Bid caching was not the only issue that sparked conversations about transparency. Reeve triggered a discussion of commercial transparency by highlighting the ongoing issue of publishers being fettered by commercial terms dictated by those further up the chain: terms they had no control over and that ultimately reduced their revenue. He explained that The Ozone Project was created to enable publishers to act as they should be able to, as the stakeholders of the user experience. He cited bid caching as a prime example of a lack of collaboration that ultimately harmed publishers through no fault of their own.
Eccles shared with the audience work done at The Telegraph between its content and advertising teams, which resulted in a 4x improvement in page load time compared with 12 months ago. Commenting on the lack of transparency, Eccles called on the industry to engage in clear communication about how all the technologies in place allow publishers to control what happens to their inventory and what their customers are buying.
Auction Mechanics and Marketplaces
Transparency remained at the heart of the conversation as the panel shifted to first-price vs. second-price auctions. Reeve commented that it is paramount that buyers know the terms under which they are buying media. In today's multi-tiered world, he explained, both first- and second-price auction options are needed: in a primary auction, he argued, the only fair option is a second-price auction, whereas in open auctions a first-price model makes sense. The panel all agreed that the auction dynamics of the future must be designed so that media is valued fairly and transparently.
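The distinction the panel drew comes down to how the clearing price is computed. As a minimal sketch (the function name and CPM figures are hypothetical, not any exchange's implementation), the two rules differ in a single line:

```python
def clear_auction(bids, pricing="first"):
    """Return (winner, clearing_price) for a list of (bidder, cpm_bid) tuples."""
    ranked = sorted(bids, key=lambda b: b[1], reverse=True)
    winner, top_bid = ranked[0]
    if pricing == "first":
        # First-price: the winner pays exactly what they bid.
        price = top_bid
    else:
        # Second-price: the winner pays the runner-up's bid
        # (in practice, often plus a minimal increment).
        price = ranked[1][1] if len(ranked) > 1 else top_bid
    return winner, price

# Example: the same bids clear at different prices under each rule.
bids = [("dsp_a", 4.50), ("dsp_b", 3.80), ("dsp_c", 2.10)]
```

This is why Reeve stressed that buyers must know which rule applies: with identical bids, the winner's cost differs, so bidding strategy has to differ too.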
The discussion shifted to the relative merits of open and closed markets, starting with how to balance scale and quality. Casale proffered that data is the key to overcoming the lack of scale in closed markets: with a rich user graph, marketers can overlay first-party data onto programmatic buying data to increase reach within a narrow audience segment. He pointed out that fragmentation of the data-technology landscape is a challenge here, and that without consolidation it will remain one.
Data Protection and GDPR
GDPR has caused major waves across the programmatic supply chain as publishers, technology vendors, data companies, and brands adapt to stricter regulations for collecting and processing user data. The panel agreed that the new legislation has made it harder to achieve scale and that demand for managed services has increased as a result.
Increasing concerns about brand safety are also constraining the ability to achieve scale. The conversation then pivoted back to the issue of trust, and the panel agreed that guaranteed buys in programmatic are becoming increasingly popular because they create an atmosphere of trust and shared value across the sell- and buy-side.
There is a clear desire to solve the issues of scale and transparency rather than going around in circles on the same problems. Ultimately, it's about putting in place technology and relationships that enable collaborative determination of true market value and a frictionless relationship between buyers and sellers. This, the panel agreed, will drive trust and increased investment, helping shift the perception that programmatic is just a cheap way to buy impressions.
The overriding theme of the panel was that all players in the supply chain need to collaborate and create an environment where trust is paramount, and the integrity of scale and quality is no longer questionable. Only then will the wider industry be able to take on Google and Facebook in a quest to secure more than 20% of digital ad spend and reduce the control the duopoly has.
We've seen the recent open letter in the US calling for the development of principles to build clarity and trust in programmatic. This ties in well with what the panel was demanding, and it will be interesting to see how the initiative develops.