NRF Taps Insider One to Power AI-Driven Engagement for Retail’s Biggest Global Event

artificial intelligence 9 Jan 2026

When the National Retail Federation (NRF) prepares to host 40,000 retailers from around the world at its flagship Retail’s Big Show, the challenge isn’t visibility—it’s relevance. In an era where large-scale events compete in a crowded, digital-first attention economy, NRF is rethinking how it attracts, engages, and personalizes experiences for a massive and diverse audience.

To do that, the world’s largest retail trade association has standardized its customer engagement stack on Insider One, an AI-native omnichannel experience and customer engagement platform. The move reflects a broader shift in event marketing: away from generic campaigns and toward real-time, data-driven personalization at scale.

Why NRF needed a new engagement model

For years, major industry events relied on broad messaging, static websites, and email-heavy outreach to drive registrations. That approach no longer works—especially for global audiences with different roles, interests, regions, and expectations.

NRF’s audience spans retailers, brands, technology providers, analysts, and executives across multiple continents. Each group engages differently, consumes different content, and values different outcomes from the event. Treating them as a single cohort risks lower registrations, weaker engagement, and missed opportunities.

Insider One entered the picture as NRF sought to unify its fragmented digital ecosystem and activate real-time data across every touchpoint—from first website visit to post-event engagement.

One platform, unified data foundation

A key part of the partnership was consolidation. NRF integrated its Retail’s Big Show, NRF Protect, and NRF corporate websites into a single Insider One panel. This created a centralized foundation for managing experiences, campaigns, and experimentation.

By connecting Segment data, NRF embedded registration and behavioral insights directly into the personalization layer. Instead of siloed analytics and disconnected tools, engagement decisions are now informed by live user behavior and historical data.

This matters because scale introduces complexity. At tens of thousands of attendees, even small inefficiencies—irrelevant messaging, poorly timed nudges, or generic content—can significantly impact conversion and engagement metrics.

Driving registrations with AI-led segmentation

At the core of Insider One’s value for NRF is AI-powered segmentation and orchestration. Rather than manually defining static audience lists, NRF can activate dynamic segments based on behavior, intent signals, and engagement patterns.

That enables what marketers often promise but struggle to deliver: the right message, to the right audience, at the right moment.

For NRF, this translates into faster campaign execution, reduced friction in the registration journey, and higher-impact outreach across channels. Messaging can adapt in near real time as users interact with content, revisit pages, or show interest in specific themes such as AI, supply chain, or digital commerce.

The result is a registration strategy that feels more like a guided journey than a broadcast campaign.
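The dynamic-segmentation pattern described above can be sketched in a few lines. This is an illustration of the general idea (deriving segments from live behavior and intent signals rather than maintaining static lists), not Insider One's actual data model or API; the `Visitor` fields, segment names, and thresholds are all hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical visitor record: page views, topic interests, and registration
# status stand in for the behavioral signals the platform would track.
@dataclass
class Visitor:
    pages_viewed: int = 0
    topics: set = field(default_factory=set)
    registered: bool = False

def assign_segments(visitor: Visitor) -> list[str]:
    """Map live behavior onto dynamic segments instead of static lists."""
    segments = []
    if not visitor.registered and visitor.pages_viewed >= 3:
        # Repeat visits without registering: candidate for a deadline nudge.
        segments.append("high-intent-unregistered")
    if "ai" in visitor.topics:
        segments.append("ai-track-interest")       # route AI-session content
    if "supply chain" in visitor.topics:
        segments.append("supply-chain-interest")
    return segments or ["general"]
```

Because segments are recomputed from current behavior on each evaluation, messaging can shift as soon as a visitor revisits pages or shows interest in a new theme, which is the "near real time" adaptation the text describes.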

Expanding engagement beyond email

Email remains important, but it’s no longer sufficient on its own—especially for time-sensitive event communication. To deepen engagement, NRF introduced web push notifications as a new channel through Insider One.

The channel quickly built a highly engaged subscriber base, allowing NRF to deliver timely, personalized updates before, during, and after the event. Web push is particularly effective for reminders, deadline-driven messaging, and content highlights, areas where inbox fatigue often limits impact.

NRF has also rolled out a series of high-impact on-site experiences, including:

  • Countdown banners highlighting registration deadlines

  • Slide-out bars promoting key sessions and content tracks

  • An upcoming experience designed to drive mobile app downloads, extending engagement into the on-site and post-event phases

Together, these elements reflect a shift toward always-on engagement, where each interaction builds on the last rather than resetting with every campaign.

Personalization at scale through experimentation

Personalization at NRF isn’t static—it’s experimental by design. Using Insider One, the organization is actively testing and optimizing on-site experiences to understand what resonates with different audience segments.

Recent experiments include experiences tailored for international audiences, visibility for AI-focused content, and tests around blog placement and content discovery. Additional experiments are planned to refine both content performance and advertising outcomes.

This test-and-learn approach mirrors what leading retailers themselves practice. NRF isn’t just talking about innovation on stage; it’s applying the same principles to its own digital operations.

Smarter content discovery with AI recommendations

Looking ahead, NRF plans to deploy Insider One’s Smart Recommender to personalize content discovery across its blogs and digital properties.

Rather than serving the same articles to every visitor, Smart Recommender uses prior engagement and individual interests to surface relevant content. For an organization with a deep content library and diverse audience, this can significantly increase dwell time, repeat visits, and perceived value.

It also positions NRF to extend personalization well beyond event marketing—into year-round thought leadership, research, and member engagement.

What this signals for event marketing and MarTech

NRF’s adoption of Insider One reflects a larger trend reshaping the MarTech and event marketing landscape.

Large-scale events are increasingly treated like digital products, not one-off campaigns. That means continuous optimization, omnichannel engagement, and AI-driven decision-making are becoming table stakes rather than differentiators.

Platforms that unify data, orchestration, experimentation, and personalization are gaining ground over point solutions. At the same time, AI-native systems are reducing the manual effort required to manage complexity at scale.

For MarTech vendors, NRF’s move is also a signal. Even organizations with strong brands and guaranteed attendance are investing heavily in personalization—not to attract attention, but to deliver relevance.

Built for what’s next

As AI becomes a foundational layer of customer engagement, NRF is applying the same innovation mindset it promotes across the retail industry to its own operations.

With Insider One, NRF isn’t just promoting Retail’s Big Show. It’s demonstrating how large organizations can orchestrate personalized, data-driven experiences across massive audiences—without sacrificing speed or scale.

In doing so, NRF is setting a new benchmark for how global events are marketed and experienced in an AI-first world.

Get in touch with our MarTech Experts.

Vertiv’s Frontiers Report Maps How AI Is Rewriting the Data Center Playbook

artificial intelligence 9 Jan 2026

Artificial intelligence is no longer just another workload inside the data center—it’s reshaping the data center itself. That’s the central message of Vertiv’s new Frontiers report, which examines how macro forces tied to AI are driving fundamental changes in how facilities are designed, powered, cooled, and operated.

Drawing on expertise from across Vertiv’s engineering and technology teams, the report expands on the company’s annual data center trends outlook, offering a deeper look at the forces pushing the industry toward what Vertiv increasingly describes as “AI factories.” These facilities are denser, faster to deploy, and more tightly integrated than anything that came before them.

“The data center industry is continuing to rapidly evolve how it designs, builds, operates and services data centers, in response to the density and speed of deployment demands of AI factories,” said Scott Armul, Vertiv’s chief product and technology officer. According to Armul, cross-technology pressures—especially extreme densification—are accelerating shifts toward higher-voltage DC power, advanced liquid cooling, on-site energy generation, and digital twins.

Together, these changes point to an industry in the middle of a structural reset.

The macro forces reshaping data centers

At the heart of the Frontiers report is Vertiv’s identification of four macro forces that are redefining data center innovation.

First is extreme densification, driven primarily by AI and high-performance computing workloads. GPU-rich racks are pushing power densities far beyond what legacy facilities were designed to handle, stressing everything from power distribution to cooling systems.

Second is gigawatt scaling at speed. AI demand is forcing operators to deploy capacity faster—and at larger scale—than ever before. Hyperscale-style growth is no longer limited to cloud giants; enterprises, governments, and AI-native companies are now planning facilities measured in hundreds of megawatts or more.

Third is the idea of the data center as a unit of compute. In the AI era, facilities can no longer be treated as collections of loosely coupled systems. Power, cooling, IT, and software must be designed and operated as a single, tightly integrated platform.

Finally, silicon diversification is complicating infrastructure planning. Data centers must now support a growing mix of CPUs, GPUs, accelerators, and custom silicon, each with different power, cooling, and operational profiles.

These forces set the stage for five technology trends that Vertiv believes will define the next phase of data center evolution.

Powering up for AI

Power architecture sits at the center of the AI data center challenge. Most facilities today still rely on hybrid AC/DC power distribution, with multiple conversion stages between the grid and the IT rack. While proven, this approach introduces inefficiencies that become increasingly problematic as rack densities climb.

AI workloads are exposing those limits. According to Vertiv, the industry is moving toward higher-voltage DC power architectures, which reduce current, shrink conductor size, and eliminate some conversion stages by centralizing power conversion at the room level.

Hybrid AC/DC systems remain common, but as standards mature and equipment ecosystems develop, full DC architectures are expected to gain traction—especially in high-density environments. The shift is further reinforced by on-site generation and microgrids, which naturally align with DC-based distribution.

In practical terms, power design is becoming less about incremental efficiency gains and more about enabling scale. Without rethinking power delivery, gigawatt-class AI deployments simply won’t be feasible.

Distributed AI changes where compute lives

The first wave of AI investment focused heavily on centralized hyperscale data centers built to train and run large language models. But Vertiv’s report suggests the next phase will be more distributed.

As AI becomes mission-critical, organizations will make more nuanced decisions about where inference workloads run. Factors such as latency, data residency, security, and regulatory compliance are pushing some industries toward on-premises or hybrid AI environments.

Highly regulated sectors—including finance, defense, and healthcare—are prime examples. For these organizations, sending sensitive data to public clouds may not be an option, even if cloud-based AI services are readily available.

Supporting distributed AI requires flexible, scalable infrastructure, particularly high-density power and liquid cooling systems that can be deployed in new builds or retrofitted into existing facilities. This trend blurs the traditional line between hyperscale and enterprise data centers, bringing AI-class infrastructure closer to the edge.

Energy autonomy accelerates

On-site power generation has long been a staple of resiliency planning, but AI is changing the equation. Power availability—not just reliability—is becoming a limiting factor for new data center projects in many regions.

Vertiv notes that extended energy autonomy is emerging as a strategic priority, especially for AI-focused facilities. Investments in natural gas turbines and other on-site generation technologies are increasingly driven by grid constraints rather than backup requirements alone.

This shift is giving rise to strategies like “Bring Your Own Power (and Cooling)”, where operators design facilities around self-generated energy and tightly integrated thermal systems. While capital-intensive, these approaches offer more predictable scaling and faster time to deployment in power-constrained markets.

Energy autonomy also intersects with sustainability goals, forcing operators to balance capacity expansion with emissions considerations and long-term regulatory risk.

Digital twins move from planning tool to necessity

As AI infrastructure grows more complex, traditional design and deployment processes are struggling to keep up. Vertiv’s report highlights digital twin technology as a critical enabler for speed and scale.

By using AI-driven digital twins, operators can virtually model entire data centers—including IT, power, and cooling systems—before anything is built. These virtual environments allow teams to validate designs, optimize layouts, and integrate prefabricated modular components.

The payoff is speed. Vertiv estimates that digital twin-driven approaches can reduce time-to-token by up to 50%, a metric that matters deeply in competitive AI markets. Faster deployment means faster access to compute, which can translate directly into business advantage.

Digital twins also support the concept of the data center as a unit of compute, reinforcing tighter integration between physical infrastructure and AI workloads.

Adaptive, resilient liquid cooling

Liquid cooling has rapidly moved from niche to necessity as AI workloads push beyond the limits of air cooling. But Vertiv argues that cooling innovation isn’t stopping at adoption—it’s becoming smarter.

AI itself is now being applied to optimize liquid cooling systems, using advanced monitoring and control to predict failures, manage fluid dynamics, and improve overall resilience. In high-value AI environments, where downtime can be extraordinarily expensive, predictive cooling intelligence could significantly boost uptime and hardware longevity.

As liquid cooling becomes mission-critical, adaptive systems that learn and respond in real time may become a standard expectation rather than a premium feature.

Why this matters beyond data centers

While the Frontiers report is focused on infrastructure, its implications ripple outward into cloud strategy, AI economics, and even MarTech and AdTech ecosystems. AI-driven services—from personalization engines to real-time analytics—ultimately depend on the scalability and reliability of the underlying compute layer.

If AI factories struggle with power, cooling, or deployment speed, innovation at the application layer slows as well. Conversely, breakthroughs in infrastructure efficiency can lower costs and expand access to AI capabilities across industries.

Vertiv’s framing also underscores a broader industry truth: AI transformation isn’t just about models and software. It’s equally about electrons, heat, and physical space.

A data center industry in transition

Vertiv operates in more than 130 countries, spanning power management, thermal management, and IT infrastructure from the cloud to the edge. That breadth gives the company a wide-angle view of how infrastructure demands are changing—and how quickly legacy assumptions are being challenged.

The Frontiers report makes it clear that incremental upgrades won’t be enough. AI is forcing a rethinking of foundational design choices, from power architecture to cooling strategy to how facilities are conceptualized and delivered.

As Armul puts it, gigawatt-scale AI innovation depends on embracing these shifts. The data center, once a supporting actor, is now a central character in the AI story—and its evolution may determine how fast the next wave of AI progress arrives.

transcosmos and Priv Tech Launch Privacy Consulting to Unlock Data-Driven Marketing in a Post-Cookie Era

digital marketing 9 Jan 2026

As privacy regulations tighten and signal loss reshapes digital advertising, marketers face a growing paradox: they need more data to improve performance, but fewer ways to use it safely. transcosmos and Priv Tech, Inc. believe the solution lies at the intersection of privacy engineering and marketing execution.

The two companies have announced the joint launch of Privacy Consulting Services, which rolled out in December 2025. The new offering is designed to help companies leverage first-party data for digital marketing—particularly through Conversion APIs (CAPI)—while complying with complex privacy regulations in Japan and overseas.

The timing is deliberate. As platforms push server-side tracking and AI-driven optimization, outdated privacy policies have become a hidden bottleneck preventing marketers from fully activating their data.

Why privacy has become a growth constraint

For years, privacy compliance was treated as a legal checkbox. Today, it directly impacts marketing performance.

Regulations such as Japan’s Act on the Protection of Personal Information (APPI), the Telecommunications Business Act, GDPR, and CCPA have raised the bar for how companies collect, disclose, and activate user data. At the same time, advertising platforms increasingly rely on first-party conversion and event data—often transmitted via CAPI—to fuel AI learning and improve targeting accuracy.

The result is a growing gap. Many companies want to deploy CAPI to offset cookie loss and improve ROI, but their privacy policies, consent frameworks, and internal governance structures aren’t ready. Revising them has become slow, risky, and resource-intensive, especially for organizations operating across markets.

That’s the problem transcosmos and Priv Tech are aiming to solve.

A joint approach: privacy by design, performance by default

The new Privacy Consulting Services combine transcosmos’s strength in marketing execution and technology deployment with Priv Tech’s expertise in privacy protection and privacy-enhancing technologies.

Rather than treating privacy and performance as opposing forces, the partnership positions privacy as an enabler of modern digital marketing. By building compliant foundations first, companies can activate data with greater confidence and scale.

The service provides end-to-end support, spanning privacy strategy, policy revision, technology implementation, and marketing enablement. This approach is designed to help businesses move from regulatory uncertainty to operational readiness—without stalling their advertising initiatives.

What the service covers

At its core, the offering focuses on removing the friction that prevents companies from deploying CAPI and other data-driven marketing technologies.

Key service components include:

  • Support for revising privacy policies to align with new and evolving regulations

  • Compliance assistance for Japanese privacy laws, including APPI and the Telecommunications Business Act

  • Cookie policy development, including site scans and policy template creation

  • Consent management platform (CMP) deployment, ensuring proper user consent flows

  • Regulatory risk mitigation, reducing exposure to compliance failures and public backlash

By addressing both legal requirements and technical implementation, the service aims to shorten the path from compliance to activation.

Why CAPI is central to the strategy

Conversion APIs have become a critical infrastructure layer for digital advertising. As browser-based tracking degrades, server-side data transmission allows platforms like Meta and Google to receive higher-quality conversion signals, improving attribution and optimization.

But CAPI only works if companies can lawfully collect, process, and transmit user data—and clearly disclose those practices. Without compliant privacy policies and consent mechanisms, CAPI adoption can expose businesses to regulatory and reputational risk.

This is where the partnership’s positioning is notable. Instead of selling CAPI deployment in isolation, transcosmos and Priv Tech are framing it as part of a privacy-first data utilization strategy.
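The compliance-then-activation flow described in this section can be sketched as a minimal server-side example: hash identifiers before they leave the server, and gate transmission on recorded consent. The field names loosely follow common CAPI conventions but are illustrative rather than any platform's exact schema, and the consent flag stands in for whatever a deployed CMP would actually record.

```python
import hashlib
import time
from typing import Optional

def normalize_and_hash(value: str) -> str:
    """Platforms generally expect identifiers to be normalized (trimmed,
    lowercased) and SHA-256 hashed; raw PII should never leave the server."""
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()

def build_capi_event(email: str, event: str, consent_granted: bool) -> Optional[dict]:
    """Assemble a server-side conversion payload, gated on user consent.
    Transmission itself would be an authenticated HTTPS POST to the
    platform's endpoint, omitted here."""
    if not consent_granted:
        return None  # CMP recorded no consent: the event must not be sent
    return {
        "event_name": event,
        "event_time": int(time.time()),
        "user_data": {"em": normalize_and_hash(email)},
        "action_source": "website",
    }
```

The ordering is the point: the consent check and hashing happen before any payload exists, which is what "privacy-first data utilization" means in practice—compliance is a precondition of activation, not a later audit step.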

Expected outcomes for marketers

According to the companies, businesses that adopt the new service can expect several tangible benefits:

  • Stronger customer trust and brand value, driven by transparent data practices

  • More effective use of marketing data, improving targeting precision and ROI

  • Reduced privacy risk, including protection against compliance violations and online backlash

  • A competitive advantage, built on privacy as a differentiator rather than a constraint

In an environment where consumers are increasingly sensitive to how their data is used, these outcomes carry strategic weight beyond short-term performance metrics.

A broader signal for the MarTech market

The launch reflects a broader trend across MarTech and AdTech: privacy is no longer a downstream concern—it’s becoming a core capability.

As AI-driven marketing depends more heavily on high-quality first-party data, companies that can operationalize privacy at scale will move faster than those stuck in legal and technical gridlock. Consulting models that blend compliance, technology, and performance may become more common, particularly in markets like Japan where regulatory complexity is high.

For transcosmos, the partnership reinforces its role as a full-stack marketing and technology partner. For Priv Tech, it positions privacy engineering not just as risk management, but as a growth enabler.

Building sustainable growth through trust

transcosmos says it remains committed to helping businesses achieve sustainable growth by addressing real-world client challenges. In today’s digital marketing environment, few challenges are as pressing—or as intertwined—as privacy compliance and performance.

By launching Privacy Consulting Services together, transcosmos and Priv Tech are making a clear statement: the future of data-driven marketing belongs to companies that can balance trust, compliance, and AI-powered optimization—at the same time.

ThreatModeler Acquires IriusRisk to Create a Global Powerhouse in AI-Driven Threat Modeling

artificial intelligence 9 Jan 2026

Threat modeling has long been considered a “nice to have” in application security—valuable in theory, but hard to scale in practice. That’s changing fast. As AI accelerates software development and expands the attack surface, enterprises are being forced to rethink how security is embedded from day one.

Against that backdrop, ThreatModeler has announced the acquisition of IriusRisk, bringing together what the companies describe as the two leading enterprise threat modeling platforms. The deal positions ThreatModeler as a dominant force in a rapidly expanding $30 billion application security market, with ambitions to make secure-by-design practices continuous, scalable, and deeply integrated into modern development lifecycles.

The move is as much about timing as it is about technology.

Why this acquisition matters now

Enterprise security teams are under pressure from both sides. On one end, development velocity is increasing, driven by cloud-native architectures, microservices, and AI-assisted coding. On the other, cyber threats are becoming more automated, targeted, and sophisticated.

Threat modeling sits at the intersection of those forces. It helps organizations identify design-level risks before code is written or deployed—but only if it can be applied consistently and at scale. Historically, that’s been the challenge.

By acquiring IriusRisk, ThreatModeler is betting that consolidation, automation, and AI-native intelligence are the keys to unlocking threat modeling’s next phase.

“With the addition of IriusRisk, we’re building the global leader in the threat modeling market to meet rapidly expanding demand,” said Matt Jones, CEO of ThreatModeler. “Together, we deliver customers greater innovation, expanded support, and more scalable solutions that make secure-by-design a sustainable, continuous practice at enterprise scale.”

Two leaders, complementary strengths

While both companies operate in the same category, their strengths have historically been complementary rather than redundant.

ThreatModeler is known for its AI-driven threat modeling platform, designed to help security architects rapidly model threats across complex, enterprise-scale environments. Its focus has been on speed, automation, and consistency—critical for organizations managing hundreds or thousands of applications.

IriusRisk, by contrast, has built deep traction with development and architecture teams, emphasizing collaboration, education, and adoption. Over time, that approach has helped foster what is widely regarded as the industry’s most active professional threat modeling community.

Bringing these two approaches together creates a platform that spans both sides of the security equation: architectural rigor at the enterprise level and practical engagement at the developer level.

According to the companies, customers using the combined capabilities have already seen measurable gains, including building threat models twice as fast and scaling adoption by more than tenfold.

Scaling “secure by design” beyond theory

One of the most striking claims around the acquisition is its focus on democratization. Threat modeling has traditionally been the domain of specialized security experts—a bottleneck in organizations trying to move faster.

The combined ThreatModeler–IriusRisk organization says it is uniquely positioned to change that dynamic. With hundreds of customers, tens of thousands of threat models built, and the largest professional threat modeling communities, the goal is to make secure-by-design practices accessible across entire enterprises.

That matters because most breaches aren’t caused by obscure zero-days. They’re the result of architectural oversights, misconfigurations, and design decisions made early—and rarely revisited.

By embedding threat modeling across the software lifecycle, the combined platform aims to help enterprises “virtually scale” their security teams, applying expert-level analysis without requiring expert-level headcount.

The AI angle: data, intelligence, and speed

AI is a recurring theme in the deal, and not just as a buzzword.

ThreatModeler emphasizes that the acquisition accelerates its vision of an AI-native security platform, powered by what it calls the industry’s largest proprietary threat modeling dataset. That dataset—now expanded with IriusRisk’s models, patterns, and community insights—forms the foundation for deeper intelligence and more automated decision-making.

“This milestone accelerates our vision to protect customers with an AI-native platform powered by the industry’s largest proprietary dataset,” said Archie Agarwal, Founder and Chief Innovation Officer of ThreatModeler. “By combining our teams and technology, we’re enabling faster innovation, deeper intelligence, and a security partner built to scale with our customers.”

In practical terms, this means more automated threat identification, smarter recommendations, and less reliance on manual expertise—all critical as AI both empowers developers and lowers the barrier for attackers.

A market ripe for consolidation

The threat modeling space has historically been fragmented, with a mix of open-source tools, consultancy-led approaches, and niche platforms. That fragmentation made it difficult for large enterprises to standardize practices globally.

This acquisition signals a shift toward consolidation, mirroring what has already happened in adjacent security markets such as application security testing and cloud security posture management.

Investors appear to agree. The combined company is majority owned by Invictus Growth Partners, with Paladin Capital Group, a long-standing investor in IriusRisk, remaining a shareholder. That continuity suggests confidence in the long-term growth of threat modeling as a core security discipline.

“Cybersecurity is a nonstop arms race, now accelerated by AI,” said John DeLoche, Co-Founder and Managing Partner at Invictus Growth Partners. “Threat modeling is essential for teams that want to proactively protect enterprise systems and applications. This acquisition unites leading threat-modeling expertise and creates the industry’s largest dataset, giving enterprises a decisive advantage in the AI era.”

Implications for enterprise security teams

For CISOs and application security leaders, the deal highlights a broader trend: design-time security is becoming non-negotiable.

As regulatory pressure increases and software supply chains grow more complex, organizations are being judged not just on how they respond to incidents, but on how well they prevent them. Threat modeling, once relegated to periodic reviews, is increasingly expected to run continuously alongside development.

By combining AI-driven automation with deep community adoption, ThreatModeler and IriusRisk are positioning themselves as a foundational layer in that shift.

Competitors will likely feel the pressure. Smaller vendors may struggle to match the scale, dataset depth, and enterprise reach of the combined platform, while larger security suites may look to strengthen their own design-time security capabilities through partnerships or acquisitions.

What comes next

While financial terms were not disclosed, the strategic intent is clear. ThreatModeler isn’t just expanding its footprint—it’s attempting to define what enterprise threat modeling looks like in an AI-first world.

“This is an exciting leap forward for the industry,” said Stephen de Vries, CEO of IriusRisk. “Both our companies share a passion for helping enterprises start left with their secure-by-design approach. By joining forces, we are better positioned to deliver on that shared mission.”

If successful, the acquisition could mark a turning point for threat modeling—from a specialist discipline practiced by a few, to an automated, AI-augmented capability embedded across every application and infrastructure layer.

In a security landscape where speed and foresight increasingly matter more than reaction, that shift could prove decisive.

Microsoft Unveils Agentic AI Suite to Power End-to-End Retail Automation

artificial intelligence 9 Jan 2026

Microsoft has announced a new suite of agentic AI solutions aimed at bringing intelligent automation across the entire retail value chain—from merchandising and marketing to fulfillment and store operations.

Designed to help retailers move faster and operate with greater precision, the new capabilities introduce a connected layer of intelligence that replaces fragmented workflows with coordinated, context-aware execution. Microsoft positions the offering as a foundation for a unified, intelligence-driven retail operating model built for speed, relevance, and resilience.

“The retailers that thrive will be the ones that unify their business with intelligence that reaches every corner of the value chain,” said Kathleen Mitford, Corporate Vice President of Global Industry at Microsoft. “With Microsoft’s agentic AI, retailers can automate what slows them down and amplify what sets them apart.”

Turning Conversations Into Conversions With Copilot Checkout

A central pillar of Microsoft’s retail push is Copilot Checkout, a new capability that allows shoppers to complete purchases directly within Copilot conversations—without being redirected to external websites.

The launch comes as AI-driven ecommerce traffic continues to surge. Adobe reports that AI-powered ecommerce visits during the 2025 holiday season increased 693% year over year, underscoring the growing importance of frictionless, intent-driven shopping experiences.

Copilot Checkout is now live in the U.S. on Copilot.com, with support from partners including PayPal, Shopify, and Stripe. Early participating brands include Urban Outfitters, Anthropologie, Ashley Furniture, and Etsy sellers.

For Shopify merchants, Copilot Checkout will be enabled by default following an opt-out period, allowing retailers to preserve their checkout experience while meeting customers inside AI-powered discovery flows.

Personalized Shopping Through Brand and Commerce Agents

Microsoft is also introducing Brand Agents for Shopify merchants and a personalized shopping agent template in Copilot Studio. These tools allow retailers to deploy conversational shopping experiences trained on their product catalogs and brand voice.

Brand Agents provide a turnkey option for answering product questions and guiding shoppers, while the customizable shopping agent template enables advanced experiences such as real-time recommendations, outfit building, and cross-channel discovery across web, mobile, and in-store environments.

Retailers including Kappahl Group are already exploring these tools to improve conversion rates and reduce returns by helping shoppers make more confident purchase decisions.

Catalog Enrichment and Smarter Product Discovery

To support discovery and personalization at scale, Microsoft is launching a catalog enrichment agent template in public preview. The agent automatically extracts product attributes from images, enriches listings with social and contextual insights, and streamlines onboarding, categorization, and error resolution.

Brands like Guess see catalog enrichment as a foundational layer for delivering real-time recommendations and cohesive shopping journeys across channels.

AI Agents for Store Operations and Frontline Staff

Microsoft is extending agentic AI beyond digital commerce into physical retail operations. The new store operations agent template, now in public preview, provides store associates and managers with natural-language access to inventory data, policies, and operational insights.

By combining internal data—such as sales trends and foot traffic—with external signals like weather and local events, the agent recommends staffing adjustments, flags exceptions, and suggests next-best actions in real time.
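To make the pattern concrete, here is a minimal sketch of that kind of signal-blending heuristic. Everything in it is an assumption for illustration: the field names, thresholds, and scoring rule are invented, and the actual agent's logic has not been published.

```python
# Toy staffing heuristic: blend internal signals (sales trend, foot
# traffic) with external ones (weather, local events) into a headcount
# recommendation. All weights and categories are illustrative only.

def recommend_staffing(baseline_staff, sales_trend, foot_traffic_index,
                       weather, local_event):
    """Return a recommended headcount for the next shift.

    baseline_staff: normal staffing level for this shift
    sales_trend: recent sales vs. average (1.15 = 15% above average)
    foot_traffic_index: recent foot traffic vs. average
    weather: "clear", "rain", or "storm" (assumed categories)
    local_event: True if a nearby event is expected to lift traffic
    """
    demand = (sales_trend + foot_traffic_index) / 2
    if weather == "storm":
        demand *= 0.7   # storms suppress in-store traffic
    elif weather == "rain":
        demand *= 0.9
    if local_event:
        demand *= 1.2   # nearby events lift traffic
    # Round to whole people, never below a minimum floor of 2.
    return max(2, round(baseline_staff * demand))

# A shift with above-average sales and traffic, clear skies, and a
# nearby event gets a recommendation above baseline.
suggested = recommend_staffing(6, 1.1, 1.2, "clear", True)
```

The point of the sketch is the shape of the decision, not the numbers: a real agent would learn these weights from historical data rather than hard-code them.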

Retailers such as Strandbags are using the solution to empower frontline teams while improving decision-making speed and consistency.

A Broader Shift Toward Agentic Retail

With this launch, Microsoft is signaling a broader shift toward agentic AI as the backbone of modern retail operations. By automating routine workflows across commerce, catalog management, and store operations, retailers can redirect resources toward strategy, innovation, and customer experience.

Microsoft says its approach combines deep enterprise integration with responsible AI development—positioning intelligent agents not just as tools, but as long-term operational partners for retailers navigating an increasingly competitive and AI-driven market.


Snowflake to Acquire Observe to Deliver AI-Powered Observability at Enterprise Scale

artificial intelligence 9 Jan 2026

Snowflake has signed a definitive agreement to acquire Observe, an AI-powered observability platform built natively on Snowflake, as the data cloud provider deepens its push into enterprise-scale reliability, operations, and AI-driven application management.

The acquisition positions Snowflake to deliver a new generation of AI-powered observability, designed for the scale, complexity, and economics of modern AI-native enterprises operating across distributed systems, autonomous agents, and data-intensive applications.

“As our customers build increasingly complex AI agents and data applications, reliability is no longer just an IT metric—it’s a business imperative,” said Sridhar Ramaswamy, CEO of Snowflake. “By bringing Observe’s capabilities directly into the Snowflake AI Data Cloud, we’re enabling enterprise-wide observability with open architecture and AI-powered troubleshooting at massive scale.”

From Monitoring to Agentic, AI-Driven Troubleshooting

Observe was built on Snowflake from its inception, making the integration a natural extension of the Snowflake AI Data Cloud. Together, the companies aim to move observability beyond reactive monitoring toward proactive, automated operations.

At the core of the offering is Observe’s AI-powered Site Reliability Engineer (SRE), which correlates logs, metrics, and traces through a unified context graph. By combining this intelligence with Snowflake’s trusted enterprise data, teams can detect anomalies earlier, pinpoint root causes more quickly, and resolve production issues up to 10 times faster, according to the companies.

This agentic approach is increasingly critical as enterprise systems become more distributed, autonomous, and AI-driven.

Open Standards and Telemetry at Scale

The acquisition also establishes a unified, open-standard observability architecture based on Apache Iceberg and OpenTelemetry, both of which Snowflake actively contributes to.

By treating telemetry as first-class data within the Snowflake AI Data Cloud, enterprises can manage terabytes to petabytes of logs, metrics, and traces using economical object storage and elastic compute. This approach allows observability data to be analyzed alongside business and operational data with consistent governance, analytics, and AI models.
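The core idea is that once telemetry sits in the same SQL-queryable store as business data, a single join answers questions that normally span two tools. The sketch below illustrates that pattern only; it uses SQLite as a stand-in for the data platform, and the table and column names are invented for illustration.

```python
# Illustration of telemetry as first-class SQL data: join error-log
# telemetry with business revenue data in one query. SQLite stands in
# for the data platform; the schema here is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE error_logs (service TEXT, level TEXT, count INTEGER);
    CREATE TABLE service_revenue (service TEXT, daily_revenue REAL);
    INSERT INTO error_logs VALUES
        ('checkout', 'ERROR', 42),
        ('search',   'ERROR', 3),
        ('checkout', 'WARN',  120);
    INSERT INTO service_revenue VALUES
        ('checkout', 50000.0),
        ('search',   12000.0);
""")

# Which services have elevated errors, and how much revenue rides on them?
rows = conn.execute("""
    SELECT e.service, e.count AS errors, r.daily_revenue
    FROM error_logs e
    JOIN service_revenue r ON r.service = e.service
    WHERE e.level = 'ERROR' AND e.count > 10
    ORDER BY r.daily_revenue DESC
""").fetchall()

for service, errors, revenue in rows:
    print(f"{service}: {errors} errors, ${revenue:,.0f}/day at risk")
```

In a specialized observability stack, the revenue side of that question would live in a separate system; co-locating the data is what makes the join trivial.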

Industry analysts see this as part of a broader shift away from specialized observability stacks toward lakehouse-style economics.

“Observability’s cost problem stems from treating telemetry as special-purpose data,” said Sanjeev Mohan, Principal Analyst at SanjMo. “Snowflake’s acquisition highlights how the lines between data platforms and observability platforms are blurring.”

Eliminating the Cost-Retention Tradeoff

As AI-driven applications generate unprecedented telemetry volumes, many organizations have been forced to rely on data sampling or short retention windows to control costs. Snowflake and Observe say their combined platform removes that tradeoff.

By unifying Observe’s AI-driven observability with Snowflake’s scalable data foundation, enterprises can retain high-fidelity telemetry data for longer periods while reducing overall observability costs—improving visibility across their entire data estate without sacrificing economics.

“Observability is fundamentally a data problem,” said Jeremy Burton, CEO of Observe. “By combining our AI-powered SRE with Snowflake’s AI Data Cloud, we can deliver faster insights, greater reliability, and dramatically better economics for operating the next generation of AI applications.”

Expanding Snowflake’s IT Operations Footprint

Beyond product integration, the acquisition expands Snowflake’s presence in the fast-growing IT operations management (ITOM) market. Gartner estimates the ITOM software market reached $51.7 billion in 2024, growing 9% year over year.

Following the close of the transaction—subject to regulatory approvals—Snowflake plans to deepen its focus on helping enterprises operate reliable AI agents and applications at scale. Observe’s developer-friendly approach is expected to complement Snowflake’s existing workload engines with real-time enterprise context, faster root-cause analysis, and AI-assisted troubleshooting.

Together, Snowflake and Observe aim to redefine observability as a core capability of the modern data platform—one built for AI-native systems where reliability, cost efficiency, and speed are inseparable.


Amperity Unveils Customer Data Agent to Turn AI Insights Into Marketing Action

artificial intelligence 9 Jan 2026

Amperity is taking aim at one of marketing’s most persistent problems: the gap between insight and action.

The AI-powered Customer Data Cloud company has introduced its Customer Data Agent, positioning it as the first enterprise AI agent designed not just to analyze customer data, but to act on it. Built on unified, trusted customer profiles, the agent allows marketers to move from questions to live segments and journeys in hours instead of weeks.

In an era where AI tools are plentiful but impact is uneven, Amperity’s message is clear: AI only works in marketing when it’s deeply connected to customer data and embedded directly into daily workflows.

From AI Insight to Activation—Without the Wait

For many marketing teams, access to customer insights remains a bottleneck. Data lives across systems, insights require specialized queries, and activation often depends on engineering backlogs. The result is slow decision-making and missed opportunities.

The Customer Data Agent is designed to eliminate that friction. Using conversational, natural-language prompts, marketers can explore unified customer data, generate insights, and immediately turn those insights into activation-ready segments and journeys—without translating questions into SQL, tickets, or dashboards.

“This marks an important step forward for the category,” said Tapan Patel, Research Director at IDC. “It shows how AI can move from simply providing customer insights to actually helping marketers decide what to do next.”

A Different Model for Enterprise AI

Unlike generic AI assistants layered onto fragmented datasets, Amperity’s Customer Data Agent is built on the company’s patented identity resolution technology, giving it a complete and accurate view of the customer across channels and touchpoints.

That foundation enables a different kind of enterprise AI—one that can be trusted to support decisions, not just surface patterns.

With this release, the Customer Data Agent:

  • Operates on unified, real-time customer profiles, not system-level records

  • Orchestrates specialized AI agents for segmentation, journey design, and analytics

  • Compresses the insight-to-revenue timeline from weeks to hours

This approach reflects a broader industry shift: AI is no longer judged by how impressive its outputs look, but by how effectively those outputs translate into measurable business results.

AI That Matches How Marketers Actually Work

Rather than forcing teams to adapt to new technical interfaces, the Customer Data Agent is designed to work the way marketers already think and communicate.

Teams can ask plain-language questions such as:

  • “Build a segment of high-value customers likely to repurchase this quarter.”

  • “Design a journey for first-time buyers with declining engagement.”

  • “Show which customer groups are driving the most incremental revenue.”

The agent delivers answers and can route them directly into activation, measurement, or optimization workflows—closing the loop between understanding and execution.
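The first of those questions can be pictured as a plain-language request resolving into a reusable segment definition over unified profiles. The sketch below is a toy illustration of that loop only: the profile fields, thresholds, and matching logic are invented, and the real agent uses LLM-driven orchestration rather than hand-written filters.

```python
# Toy illustration of insight-to-activation: a request like "high-value
# customers likely to repurchase this quarter" becomes a segment built
# from unified profiles. All fields and thresholds are hypothetical.

customers = [
    {"id": 1, "lifetime_value": 1200, "repurchase_score": 0.8},
    {"id": 2, "lifetime_value": 300,  "repurchase_score": 0.9},
    {"id": 3, "lifetime_value": 2500, "repurchase_score": 0.3},
]

def build_segment(profiles, min_ltv, min_repurchase):
    """Filter unified profiles into an activation-ready list of IDs."""
    return [p["id"] for p in profiles
            if p["lifetime_value"] >= min_ltv
            and p["repurchase_score"] >= min_repurchase]

# "Build a segment of high-value customers likely to repurchase."
segment = build_segment(customers, min_ltv=1000, min_repurchase=0.5)
```

The value claimed for the agent is everything around this step: translating the natural-language request into the filter, and routing the resulting segment straight into journeys and measurement without a ticket queue.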

“Most AI value gets stuck between the model and the workflow,” said Derek Slager, Co-Founder and CTO at Amperity. “The Customer Data Agent closes that gap by delivering outputs that are immediately usable.”

Why This Matters Now

AI has become central to enterprise marketing strategies, but many organizations still struggle to operationalize it. The challenge is rarely model performance—it’s the lack of clean, unified data and the operational layers needed to act on AI outputs.

Amperity’s Customer Data Agent brings those pieces together: trusted data, applied intelligence, and built-in activation. For consumer brands under pressure to personalize faster, improve ROI, and reduce dependency on technical teams, that combination could be a meaningful competitive advantage.

As AI continues to move from experimentation to expectation, tools that directly connect intelligence to execution may define the next phase of marketing technology.


Domaine Acquires Pattern to Bet Big on Experience-Led Shopify Commerce

business 9 Jan 2026

Domaine Worldwide is doubling down on experience as the next battleground in ecommerce.

The global Shopify design and development specialist has acquired Pattern, one of the most decorated digital design agencies in the commerce ecosystem, in a move that underscores a broader shift in digital commerce: design is no longer a finishing touch—it’s a growth lever.

The acquisition brings together Pattern’s brand-led, experience-first design capabilities with Domaine’s Shopify engineering, platform migration, and global delivery scale. The result is a combined offering aimed squarely at brands that see digital experience as a competitive differentiator, not a cosmetic upgrade.

Why This Deal Matters

Ecommerce has matured. Performance marketing is more expensive, platform features are increasingly commoditized, and AI-driven personalization is raising customer expectations. In that environment, brands are competing less on who has the fastest site—and more on who delivers the most cohesive, emotionally resonant experience across channels.

That’s where this acquisition lands.

“Experience design has become a defining competitive advantage for commerce brands,” said Marko Bon, President of Domaine. Bringing Pattern into the fold, he noted, expands Domaine’s ability to help clients reimagine their brands while delivering scalable, technology-driven experiences.

In practical terms, Domaine is positioning itself not just as a Shopify systems integrator, but as a full-spectrum commerce partner—from brand identity and UX to platform engineering and AI-ready infrastructure.

Pattern’s Role: Design at Critical Moments

Pattern has built its reputation by partnering with brands at inflection points: rebrands, digital redesigns, and major platform migrations where the stakes—and expectations—are highest.

Its client roster includes digitally native and enterprise brands such as Marine Layer, Framebridge, Bobbie, Image Skincare, and Rothy’s, and its work has earned more than 200 industry awards, including repeated recognition from the Webbys, W3, Glossy, and ADWEEK.

Notably, Pattern and Domaine have already worked together on high-profile projects, including Rothy’s—making this acquisition more of a formalization than a leap into the unknown.

A Unified, Experience-First Commerce Stack

The combined company is targeting brands across the growth spectrum:

  • VC-backed startups building digital-first identities

  • High-growth brands evolving their ecommerce experience

  • Enterprises needing scalable design systems, complex workflows, and AI-enabled efficiencies

Pattern brings deep expertise in brand systems, UX, and storytelling. Domaine adds global delivery, Shopify Plus engineering, and operational scale. Together, they aim to offer something many agencies struggle to balance: craft and scale.

“As Pattern experienced record-breaking growth, finding the right partner to scale with became critical,” said Isaac Newton, Co-Founder of Pattern. Cultural alignment and shared ambition, he added, made Domaine a natural fit.

Shopify, AI, and the Experience Arms Race

The timing of the deal is telling. Shopify is increasingly positioning itself as an operating system for modern commerce, with AI-driven features, composable architectures, and personalization tools baked into the platform.

But technology alone doesn’t differentiate brands.

As AI accelerates content creation and personalization, experience design becomes the filter that determines whether those capabilities feel coherent—or chaotic—to customers. Domaine and Pattern’s shared “one site per customer” philosophy reflects this shift toward deeply personalized, brand-consistent experiences powered by data and automation.

This acquisition also mirrors a broader trend in the MarTech and commerce services market: agencies are consolidating to meet rising client demands for end-to-end execution, rather than piecemeal design or development work.

Competitive Implications

Domaine’s move raises the bar for Shopify-focused agencies, many of which specialize either in design or engineering, but not both at enterprise scale. It also puts pressure on consultancies and systems integrators that have historically deprioritized brand and experience in favor of implementation speed.

For brands, the message is clear: digital commerce partners are expected to deliver strategy, experience, technology, and scalability—all under one roof.

What Comes Next

With Pattern fully integrated, Domaine plans to expand its global footprint and deepen its role in shaping how commerce brands adapt to rapid change across AI, personalization, and platform evolution.

For clients, the promise is simpler: fewer handoffs, tighter alignment between brand and technology, and digital experiences designed to scale—not break—as expectations rise.

In a crowded ecommerce landscape, Domaine is betting that experience-first commerce isn’t just a design philosophy. It’s the next phase of competitive advantage.

