The reigning paradigm for media exchange – serving free content in exchange for relevant advertising – depends on cookies. With the deprecation of the third-party browser cookie, the infrastructure of the open web will lose its cornerstone, opening digital media to a new era.
The opportunity to build a new infrastructure for media exchange has ushered in an exciting period of innovation and exploration in our industry, and I believe it will arrive at a viable solution. I agree with Scott Howe that addressability will flourish in a cookieless world, and at Adstra we have trained our efforts on giving marketers the means to flourish along with it. The goal is clear – how the industry gets there is a matter on which smart minds can differ.
But the cookie’s demise has also ushered in a period of revisionist history with respect to the world we are preparing to leave behind. The anxiety surrounding the loss of third-party cookies can make it seem like the preexisting order was some kind of Eden, a state in which data from multiple sources freely informed marketing decisioning on every level.
That’s far from the truth, of course. Access to data is highly imperfect, and the technology for orchestrating it between endpoints is more imperfect still. Rather than building anew for the arrival of digital, programmatic and identity-based marketing, the data industry has been content to retrofit old technology and old business models to new challenges.
These shortcomings have suppressed the adoption of data in the marketing enterprise, to the point where, today, only a minority of media is guided by any data at all – first party or third party, anonymous or people-based.
In its economic, technological and service dimensions, this old model has fallen out of step with the demands of contemporary marketers. Well before the cookie’s fate was sealed, the business model for provisioning and applying data to marketing was in urgent need of fixing.
A new cost model
Data brokers are incentivized to extract maximum rent from the data they provide, and they meet this objective by charging incrementally for every discrete application of that data. The result is that applying data to many marketing decisions is cost-prohibitive.
This economic model also incurs a tremendous amount of operational cost and friction. It’s incumbent upon the user of the data not only to pay for every additional application but also to report each use, an arduous process requiring additional time and technology. The work of producing the royalty reports is immense, the reports are usually wrong, and the errors cause further dispute and delay. The cost of data is not matched to its value.
The right mix of technology and service
Data on its own means nothing and does nothing. It requires guidance and human intervention to actually work.
Legacy data providers hand over data in a raw and unstructured form, and then charge incremental fees for the service time required to help their clients put it to use. Often the “managed service” offerings marketed as “bespoke” or “customized” are really a form of costly over-service designed to make up for outdated technology and products.
On the other hand, a crop of new companies is seeking to solve the service problem with software. These companies are often inherently allergic to providing costly human-based service, which would confound the economics of their “Rule of 40” SaaS business model. The result is powerful software that nobody can figure out how to use.
The needs of marketers fall somewhere in the middle. And those needs are not being met.
No more Band-Aids
Solving for the end of the cookie will require fundamental change – but will the change be fundamental enough to address these intractable problems? That’s the question before us today.
The rapid shifts coming to the market will certainly weed out players who predicated their data strategy entirely on the third-party cookie, but there will also be plenty of legacy data providers that stagger ahead with their old approaches still intact, retrofitted and rebranded for a cookieless future. Their technical and service shortcomings will impede marketers from orchestrating data between all the places it needs to go, and their business models will continue to pass the cost of those shortcomings on to marketers.
Introducing the Data Bureau
The problems with data go deeper than the loss of the cookie, and the largest category leaders are ill-suited to provide durable solutions. In today’s market, most category leaders have focused on specific use cases, solving discrete problems within specific media channels. For a company taking on the biggest problems in data, an entirely new category is needed.
This is why Adstra has launched as the industry’s first Data Bureau.
A Data Bureau is a company focused on enabling brands and advertisers to get more out of their existing marketing and customer experience technologies and campaigns by orchestrating the use of data and identity consistently across media.
A Data Bureau can ingest any form of identity (individual or household), assign a persistent ID connected to attribute data, and action against any other media or form of identity. It works independently of a brand’s choice of technology, allows identity resolution to evolve as market conditions evolve, and enables brands to meet privacy regulations by linking their data back to validated individuals or customers. The Adstra Data Bureau combines predictable cost structures with guidance from expert industry veterans, freeing the marketer to explore incremental applications of data without the burden of incremental costs.
The Data Bureau is, in other words, our response to the barriers holding back the application of data to marketing. Those barriers are much more extensive – and much longer-standing – than the question of how to solve for the death of the third-party cookie.