From Fragmented Martech Stacks to Unified Data Platforms as a foundation for AI | Martech Edge

marketing, artificial intelligence

MTE

Published on 13th Apr, 2026

Q1. The industry is clearly moving away from fragmented martech stacks. What are the main limitations you've observed with traditional setups involving DMPs, CDPs, and data clean rooms?

These tools were never designed to work together; they were built to solve different problems for different segments of the media industry at different points in time. DMPs were built mainly for publishers navigating the third-party cookie era. CDPs came along to fix the single-customer-view problem for brands internally. Data clean rooms were adopted in response to signal loss across the board by brands, publishers, and retailers alike. So you're looking at three separate architectures, three vendor relationships, three data pipelines.

What we hear constantly from publishers and retailers is that stitching these together creates enormous operational drag. Every handoff between tools is a point of latency, a potential compliance risk, and a cost center. And because none of them were built with collaboration in mind from the start, the moment you try to do something cross-party (enrichment with a partner's data, joint measurement, audience activation beyond your own properties, etc.) you hit a wall. The stack simply wasn't designed for the collaboration era, and even less for AI.

 

Q2. What is driving organizations to adopt more unified and flexible data platforms today, and how urgent is this shift?

Three pressures are converging simultaneously, which is what makes this moment feel different from earlier transitions.

First, regulation has fundamentally changed what's permissible. GDPR and a growing body of case law have made clear that the era of moving customer data freely between systems is over: organisations need technical guarantees, not just contractual ones, for fast, low-friction collaboration. Second, the signal environment has degraded: third-party cookies are declining, and universal identity solutions have helped at the margins but haven't filled the gap. Third, and most importantly, the value of first-party data is now demonstrably tied to collaboration. Data sitting in one organisation's DMP is interesting; connected to a brand's CDP or a retailer's transaction history, it becomes genuinely powerful.

The media players moving now are building structural advantages. Those waiting are watching legacy DMP contracts come up for renewal with no clear answer for what replaces them.

 

Q3. From your perspective, what does a truly "unified" data platform look like in practice, beyond just integrating multiple tools?

"Unified" often gets used to mean little more than fewer vendor logos on a slide. That's not quite what I mean here.

A truly unified platform is one whose architecture was designed from the start for collaboration and privacy, with the goal of creating networks between data owners rather than just optimising data within a single organisation. When a CDP or DMP bolts on a clean room module, the privacy guarantees are only as strong as the wrapper. And you don't inherit a network either: each partnership may have to be built from scratch.

At Decentriq, we started from the opposite direction. Our clean room uses confidential computing: hardware-level encryption where data remains protected during processing, even from us. Using that as a foundation, we built the Collaborative Audience Platform: a unified layer adding CDP- and DMP-style capabilities such as segmentation, identity resolution, activation, and shared audience products. In practice, a publisher can collect data, build and enrich audiences, activate to GAM or DSPs, run closed-loop measurement, and refresh automatically, all in one environment, with no seams between layers. That's what genuinely unified looks like.

 

Q4. Many companies still rely on stitching together multiple solutions. Where do these approaches typically fall short when it comes to scalability and efficiency?

The failures tend to become visible only at scale, which is precisely when they're most painful.

The first is the identity tax. Every time data moves between tools, you make assumptions about identity resolution. If your system can only handle one ID type, you can lose a significant portion of your audience during matching. The second is engineering overhead: stitched integrations need constant maintenance, and onboarding each new partner is its own project, meaning there is a hard ceiling on how many collaborations you can run in parallel. The third, which comes up in almost every conversation with publishers replacing their DMP, is the inability to operationalize collaboration at scale. One-off clean room projects are feasible. Repeatable, automated, always-on audience collaboration with multiple partners simultaneously is a different problem (and stitched stacks weren't designed for it).

 

Q5. How is this shift impacting data collaboration between brands, publishers, and retailers in real-world scenarios?

The most significant change is the move from one-to-one integrations to network-based collaboration, because this changes the economics of data entirely and provides a crucial foundation for AI.

In the old model, a publisher ran a bespoke clean room project with one advertiser at a time. High cost, limited scale. A platform model enables something fundamentally different: standardised, repeatable collaborations across a growing network simultaneously. We've seen this with OneLog in Switzerland using our technology: five publishers unified under a single audience monetization platform, enabling advertisers to plan, activate, and measure across their combined audiences.

We're seeing the same dynamic for retailers. Decentriq's Collaborative Audience Platform lets them build audiences from online and offline signals and activate with brands and premium publishers (including CTV) without raw transactional data ever leaving their control. For brands, this means accessing publisher and retailer audience data through a standardized, privacy-safe workflow instead of negotiating lots of separate agreements.

 

Q6. Privacy and compliance remain key concerns. How do modern unified platforms address these challenges more effectively than legacy martech stacks?

Legacy stacks address privacy primarily through contracts — data processing agreements, retention policies. These are necessary but not sufficient. Contracts tell you what should happen; they don't technically prevent what shouldn't.

Decentriq uses confidential computing as the foundation for data collaboration: data is processed inside a hardware-secured enclave inaccessible to any party, including us. The privacy guarantee is technical, not contractual. A significant recent CJEU ruling validated exactly this approach, clarifying that pseudonymised data processed through technology where re-identification is technically impossible carries a different compliance profile than data protected only by agreement.

For organizations navigating GDPR, this shifts the burden dramatically: instead of documenting every data flow and relying on ongoing contractual enforcement, you can demonstrate provable technical compliance. That's increasingly what regulators, legal teams, and enterprise procurement are demanding.

 

Q7. What role does AI and automation play in enabling more seamless and actionable data collaboration within these new ecosystems?

The critical point is where AI runs. AI operating on raw data is a privacy risk. AI operating inside a confidential computing environment, on data that is never exposed, is a fundamentally different proposition.

At Decentriq, AI is embedded at several levels: lookalike modelling that extends a seed audience without either party revealing their underlying data (a luxury automotive brand saw +80% engagement and +58% conversion rate using this, for example), audience size estimation before a segment is built, and automated refresh cycles that keep audiences current across partners without manual intervention.
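For readers unfamiliar with lookalike modelling, the core idea can be sketched in a few lines. The toy Python example below is a hypothetical illustration of the general technique, not Decentriq's implementation (which runs on protected data inside a confidential computing enclave): it ranks candidate users by cosine similarity to the centroid of a seed audience's feature vectors and keeps the top k.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def lookalike(seed, candidates, k):
    """Return the k candidate IDs most similar to the seed audience centroid."""
    dims = len(next(iter(seed.values())))
    centroid = [sum(v[i] for v in seed.values()) / len(seed) for i in range(dims)]
    ranked = sorted(candidates.items(), key=lambda kv: cosine(kv[1], centroid), reverse=True)
    return [uid for uid, _ in ranked[:k]]

# Toy feature vectors, e.g. normalised engagement signals per user (invented data)
seed = {"s1": [1.0, 0.9, 0.1], "s2": [0.9, 1.0, 0.0]}
candidates = {"c1": [0.95, 0.9, 0.05], "c2": [0.1, 0.0, 1.0], "c3": [0.8, 0.7, 0.2]}
print(lookalike(seed, candidates, 2))  # ['c1', 'c3']: the candidates closest to the seed
```

In a clean-room setting, the interesting part is that a computation like this runs where neither party can inspect the other's vectors; only the resulting audience (the ranked IDs) is released.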

Further out, the more AI is integrated into these environments, the more the collaboration network itself learns, from joint activations, measurement results, and partner interactions, rather than resetting with each new campaign. That's the direction this is heading.

 

Q8. Looking ahead, what key changes do you expect in how organizations approach data infrastructure and collaboration over the next 2–3 years?

Three shifts feel clear.

First, stack consolidation. Organisations running separate DMPs, CDPs, and clean rooms will consolidate around platforms that do two, if not all three, natively. The maintenance cost, compliance complexity, and operational drag will drive that decision.

Second, the ecosystem model becomes the norm. The value of first-party data is increasingly defined not by how much you have, but by how well it connects. Publishers contributing audiences to a collaborative network unlock revenue that's unavailable to those working in isolation. Retailers whose data can activate across a premium publisher network and close the loop with sales measurement are in a completely different competitive position. That logic will only accelerate. And as AI becomes more deeply embedded in these workflows, the network itself becomes a training asset: the more data flows through a shared collaborative infrastructure, the smarter and more precise the models that power lookalike targeting, audience estimation, and measurement become. Isolated stacks simply can't compete with that.

Third, privacy-preserving infrastructure shifts from differentiator to baseline expectation. Confidential computing and hardware-level privacy guarantees are currently seen as advanced or optional. In 2–3 years, driven by regulation, enterprise procurement standards, and the demonstrated risks of the alternatives, they'll be standard requirements. The organisations betting on these foundations now will be ahead of that curve rather than catching up to it.

 
