Martech Edge | Best News on Marketing and Technology


Inside the Programmatic Audio Boom with Jen Oon, SVP of DAX US


marketing 12 Mar 2026

Q1. Give us a short introduction about who you are and how you got into programmatic advertising
I’m Jennifer Louie Oon, Senior Vice President of Sales at DAX United States, where I focus on helping brands and publishers unlock the value of programmatic audio. My career didn’t follow a straight line. I’ve always prioritized learning and being curious about how media connects with people, which naturally led me into programmatic and the evolving opportunities in audio advertising. 
 
Q2. Why is programmatic audio finally operating like a scaled, reliable channel?
Programmatic audio has matured because the ecosystem is finally building the infrastructure and standards that make scale and reliability possible. Better targeting, improved measurement, and more sophisticated tools have helped reduce friction for buyers and sellers alike. As audio’s measurement capabilities improve, the channel is starting to deliver at a level brands can confidently invest in. The kind of consistency that used to exist only in video and display is now becoming real in audio.
 
Q3. How do you think buyers and sellers are navigating fragmentation across audio environments?
Fragmentation, from podcasts to ad-supported streaming and local audio, is both a challenge and an opportunity. Buyers are recognizing that focusing only on a handful of major platforms leaves gaps in reach and audience coverage. Sellers are looking to technology to standardize access and measurement across those environments. By embracing programmatic solutions and expanding how audio inventory is packaged and activated, the industry can actually turn fragmentation into reach and precision rather than confusion.
 
Q4. What do you believe advertisers are getting wrong about audio performance and measurement?
A lot of advertisers still think about audio the way they did years ago, as a support channel that’s hard to measure. That perception comes from outdated measurement models that haven’t kept up with how people actually consume audio. Audio delivers attention and engagement, but without reliable measurement, its performance hasn’t been properly credited. Once measurement systems catch up and show audio’s true impact, especially outside the biggest platforms, spend will shift to better reflect how audiences are actually listening.
 
Q5. How can DAX help brands navigate a fragmented audio market and drive measurable results through programmatic audio?
DAX helps brands and publishers by reducing the complexity of a fragmented market. We bring premium ad-supported audio inventory into a programmatic environment where buyers can efficiently reach audiences at scale and with transparency. We also focus on driving better measurement and insights so advertisers can understand what’s working and optimize in real time. By combining reach, reliability, and clearer performance data, DAX enables brands to invest confidently in audio as a core part of their media mix.
The New Marketing Stack: Where PR Fits in the AI-Driven Martech Ecosystem.


marketing 11 Mar 2026

By Pilar Lewis, Senior Public Relations Associate at Marketri 


For most of my career, PR has been viewed as something that supports marketing. Now it plays a much more central role.


As generative tools like ChatGPT, Google AI Overviews, Perplexity, and Claude become a default layer of discovery, brand visibility is shifting from “Can you find us?” to “Will the AI cite us?” That change elevates PR from a communications function to a foundational input into how AI interprets authority, credibility, and relevance.


Discovery is moving from search results to AI answers


Traditional search rewarded the best mix of technical SEO, content depth, and domain authority. Generative discovery changes the interface and the outcome. Buyers increasingly get a synthesized answer, not a list of links, and the brands that appear in the citations become the brands that feel “known.”


This is why the martech stack conversation needs an update. It's no longer enough to orchestrate journeys and optimize conversion paths if the brand never enters the consideration set because it’s not showing up in AI-generated answers.


AI systems trust third-party validation more than brand messaging


AI answer engines don't treat brand-owned content and third-party coverage as equals. In practice, credible reporting, expert commentary, and industry publications tend to carry more weight than promotional content because they look like independent validation.


PR is the system that produces that validation at scale. Earned media creates durable signals about what a company does, what it’s known for, and who’s willing to quote it. Over time, those signals form “entity memory,” a pattern of consistent, verifiable references that increases AI confidence and makes citation more likely.


PR is becoming a core layer of the modern marketing stack


The modern stack has been built around four pillars: data, automation, content, and measurement. PR now belongs in that set because it supplies a unique input: third-party authority.


Here's an easy way to think about it:


1. SEO and content help you publish what you want to be known for.

2. PR helps others validate it publicly, in places AI systems trust.

3. Analytics and attribution help you see how visibility turns into demand.

4. Automation helps you act on that demand.

The industry is already moving toward systems that act, not just report. If marketing is becoming more autonomous, the inputs to those systems matter more. PR is one of the inputs that shapes the “truth set” AI systems pull from when they speak on your behalf.


PR and GEO work better together than either does alone


Generative Engine Optimization (GEO) is the discipline of structuring and publishing content so AI systems can extract it, understand it, and cite it. PR strengthens GEO because earned media adds validation.


The practical connection is straightforward:


• PR builds authority signals through credible mentions, quotes, and bylined expertise.

• Structured content builds clarity through definitions, proof points, and consistent terminology.

• AI systems synthesize both and reward what looks well-supported across multiple sources.

This is also where the martech stack is heading. Gartner projects that by 2028, 33% of enterprise software applications will include agentic AI that can enable autonomous decisions in marketing. As those systems make more choices automatically, brands will need to engineer for how they are interpreted, not just how they rank.


Real-time PR is becoming an operating model


AI-driven discovery moves at the speed of the news cycle. The brand that shows up early and consistently in credible coverage doesn’t just win attention in the moment; it builds durable entity memory that compounds.


And the way AI systems cite sources makes the case for why PR has to operate continuously, not periodically:


• Earned media dominates what gets cited. According to Muck Rack, 89% of cited links are earned media, reinforcing that third-party validation is the “language” AI systems rely on when they answer questions.

• Recency matters more than most teams plan for. Half of all citations are published within the last 11 months, which means stale authority fades faster than traditional brand and SEO playbooks assume.

• The first week is the citation window that matters most. The first seven days after publication have the highest citation rate, so the speed of distribution, amplification, and follow-on placements directly influences what AI tools ingest and repeat.

Basically, PR isn’t only about placement now. It’s about managing freshness, consistency, and corroboration across trusted outlets so the brand remains present in the source layer AI systems reference. The outcome is visibility with credibility, delivered at the pace buyers and AI now expect.


The next marketing stack focuses on authority, not volume


Marketing leaders already know how to invest in content and automation. Now they must learn to invest in authority signals with the same intent, because visibility depends less on how much content a brand produces and more on whether credible third-party sources consistently validate the brand.


That validation is exactly what PR produces, which is why it belongs inside the modern marketing stack alongside data, automation, content, and measurement. It creates the external signals that help AI systems interpret credibility and decide which sources are trustworthy enough to cite.


As AI increasingly shapes how people discover and evaluate companies, credibility across trusted sources matters more than ever, and PR is a major part of how brands earn it.
Bear Cognition Eliminates Operational Inefficiencies Through Scalable Intelligent Automation


marketing 5 Mar 2026

The gap between AI experimentation and scale highlights a critical issue—companies lack structured workflow integration, real-time optimization, and intelligent automation frameworks to translate AI into sustained competitive advantage.

As businesses accelerate digital transformation, the real differentiator is no longer experimenting with AI but embedding intelligent process optimization directly into core operations to drive measurable ROI.


(Q) Why do organizations struggle to translate AI experimentation into enterprise-wide operational impact, and what structural barriers prevent AI from moving beyond isolated business functions into core workflows?

Most companies don’t struggle with trying AI. They struggle with making it matter. It’s easy to run a pilot. It’s much harder to embed AI into the way the business actually operates day to day. And too often, implementation of AI tools, functions, and adoption plans is inconsistent across departments and teams. That’s where things stall.

If AI doesn’t connect into core workflows, it becomes interesting but not transformational. And if it’s not tied to measurable outcomes like margin expansion, labor savings, or faster cycle times, it never becomes more than a marginally impactful tool for the organization.

At Bear Cognition, our SwaS® (Software with a Service) model is built to avoid that trap. We embed intelligence directly into operational workflows and stay engaged to ensure performance improves over time. That’s when AI stops being experimental and starts being operational.


(Q) How does the absence of intelligent workflow design limit the true potential of AI adoption?

You can have great models and still have inefficient operations.

A common example is Intelligent Document Processing. Basic OCR pulls data off a document, but then what? If someone still has to validate it, re-key it, route it, or make decisions manually, you haven’t really changed the workflow.

True IDP goes further. It extracts, validates, classifies, and routes information automatically, so the process actually moves forward without human bottlenecks.

At Bear Cognition, we don’t just focus on the model. We design the full system around it: ingestion, decision logic, automation triggers, and feedback loops.

The limiting factor usually isn’t the AI. It’s the way the workflow is structured around it.
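The workflow-first idea can be made concrete with a minimal sketch. This is an illustrative IDP-style pipeline — extract, validate, classify, route, with an exception queue for human review — not Bear Cognition's actual implementation; all names, rules, and queue labels are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    raw_text: str
    fields: dict = field(default_factory=dict)
    doc_type: str = "unknown"

def extract(doc: Document) -> Document:
    # Stand-in for OCR/model extraction: parse simple "key: value" lines.
    for line in doc.raw_text.splitlines():
        if ":" in line:
            key, value = line.split(":", 1)
            doc.fields[key.strip().lower()] = value.strip()
    return doc

def validate(doc: Document) -> bool:
    # Business rules enforced automatically instead of by a human re-keying data.
    amount = doc.fields.get("amount", "")
    return "invoice_no" in doc.fields and amount.replace(".", "", 1).isdigit()

def classify(doc: Document) -> Document:
    doc.doc_type = "invoice" if "invoice_no" in doc.fields else "other"
    return doc

def route(doc: Document, is_valid: bool, queues: dict) -> str:
    # Valid, classified documents flow straight into automation; anything
    # ambiguous lands in an exception queue for human review.
    target = "ap_automation" if is_valid and doc.doc_type == "invoice" else "human_review"
    queues.setdefault(target, []).append(doc)
    return target

queues: dict = {}
doc = classify(extract(Document("invoice_no: 1042\namount: 310.50")))
target = route(doc, validate(doc), queues)
print(target)  # -> ap_automation
```

The point of the sketch is that the model (here, a trivial extractor) is the smallest piece; the validation rules, routing logic, and exception path are what keep the process moving without a human bottleneck.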


(Q) What role does real-time performance monitoring play in scaling intelligent automation successfully?

If you’re going to automate core operations, you need visibility into how that automation is performing. Real-time monitoring ensures that outputs stay aligned with business objectives, especially when margins are tight. It allows you to catch drift early, correct exceptions quickly, and continuously improve the model based on actual outcomes.

For example, in logistics, pricing decisions can’t be static. Our Revenue Optimization System tracks win rates and margin performance and adjusts accordingly. Within our Constellation One logistics suite, automation agents operate together and are monitored in real time to prevent missed bids or revenue leakage.

At scale, automation has to evolve with the business. Monitoring is what makes that possible.


(Q) How do intelligent agents improve multi-step operational processes without increasing operational risk?

I feel there’s a misconception that automation inherently introduces risk. In reality, it’s poorly designed automation that creates risk; thoughtful automation actually reduces it.

Agents should be built to integrate into existing systems rather than replace them abruptly. They operate within defined parameters, include structured exception handling, and maintain auditability. The result is more consistency, fewer errors, and less dependency on manual processes, while keeping human oversight where it belongs.

Automation shouldn’t remove control. It should strengthen it.


(Q) What differentiates Bear Cognition’s Software with a Service (SwaS®) model from traditional AI software deployments?

Traditional SaaS vendors deploy software and move on. We stay engaged.

SwaS® combines technology with ongoing implementation, optimization, and governance. Our Data Lab designs and continuously runs these systems alongside our clients. That matters because AI isn’t static. It needs calibration, refinement, and alignment with evolving business goals.

In an industry that can seem devoid of human interaction, discussion, and feedback, we make it a cornerstone of how we work with clients. Bear Cognition doesn’t just ship software; it delivers real performance for organizations.


(Q) How does Bear Cognition enable continuous learning within its AI systems to ensure long-term performance optimization while ensuring its automation frameworks remain scalable as enterprises grow?

Continuous learning isn’t something we bolt on after the fact. It’s something we build into the architecture. Our systems incorporate feedback loops that track transaction outcomes and refine models over time. For example, our AI-enhanced IDP improves accuracy through correction-based learning.

Scalability comes from modular design. With Constellation One, organizations can deploy specific automation agents or orchestrate multiple agents together as complexity increases. Cloud-native infrastructure and performance governance ensure the system grows alongside the enterprise.

Ultimately, we focus on something simple: raising operational IQ. Intelligence that adapts over time is what allows companies to scale confidently.

Why Enterprises Are Reassessing the Marketing Suite Model


marketing 5 Mar 2026

Q1: You’ve worked inside Adobe and Salesforce ecosystems. What actually feels different right now?


Enterprise frustration with big vendors isn’t new. What feels different is how consistent it is, and how it’s coming from everywhere. In conversations with 52 enterprise teams, 94% shared real frustration with their marketing suite. That’s not scattered grumbling. That’s a clear pattern. And 79% pointed to the same three issues: complexity, cost, and vendor lock-in.


I’ve spent years working with the Adobe and Salesforce stacks. I know how they’re sold, how they’re justified internally, and how they’re defended. Historically, leaders have complained about implementation issues or specific features. Today, many are questioning whether the suite model still delivers value that justifies its cost and weight. As technology changes, they’re concerned that marketing suites will fall behind. That’s a shift.


Q2: What is “Suite Fatigue” in practical terms?


Suite Fatigue is what happens when the system still works, but demands so much effort that it stops feeling worth it. Marketing suites were originally sold as simplifiers: one platform, fewer vendors, shared data, faster execution. In reality, many teams feel slowed down.


In the research, complexity was the most common complaint, coming up 42 times out of 52. Leaders described long implementation timelines, low user adoption, the need for advanced suite-specific specialists, and workflows that required technical intervention just to run everyday campaigns. High cost came up 29 times, and vendor lock-in 22 times. When complexity leads the list, it suggests the problem isn’t feature gaps. It’s operability. And over time, that becomes fatigue.
 


Q3: Is this mainly about price and lock-in, or is something more structural happening?


Price is the most visible issue, but it’s not just about license fees. Leaders point to renewal increases, paid support, consulting dependency, unexpected version migrations, and product add-ons that quietly expand total cost. Cost becomes a problem when flexibility declines, and operational burden rises. There is also the opportunity cost of what could be done with faster, more agile Martech vendors that competitors may be using.


Lock-in makes it worse. Many teams stay with marketing suites not because they’re satisfied, but because replacing them requires rebuilding data collection, identity, and core workflows. Switching friction becomes the reason for renewal. After years of dependence, the suite vendor owns much of their Martech stack, so they are stuck.


It’s also broader than price and lock-in. The research highlighted forced re-implementations, slow innovation, and gaps between supposedly integrated tools. Individually manageable, collectively structural. That’s why this feels like reassessment, not routine dissatisfaction.
 


Q4: What are the real business consequences of staying in a fatigued stack?


The most immediate consequence is a loss of velocity. When you need specialists to orchestrate campaigns, tickets to activate audiences, and engineering cycles just to test ideas, everything slows down. That’s not fringe cost. It’s competitive drag.


There’s also a talent impact. Instead of empowering marketers to operate independently, suites often push routine execution into data teams, consultants, or integration partners. That inflates the true cost of ownership and creates hard-to-remove bottlenecks.


In an AI-driven market, the risks grow. If your stack is tightly bundled, you’re stuck with whatever AI your suite vendor decides to produce. It becomes hard to test new models, try specialized tools, or add better decision systems. So your AI roadmap becomes entirely dependent on one vendor’s release cycle.


Over time, companies stop focusing on market opportunities and start working around the constraints of their platform, including the vendor’s AI limits.
 


Q5: Why is this surfacing more forcefully now?


Two trends are converging. First, leadership turnover is changing tolerance. New marketing leaders are less emotionally tied to legacy infrastructure decisions. They’re questioning why their organizations are locked into $10M or $20M annual commitments that are hard to unwind.


Second, the data architecture is shifting. Customer data increasingly lives in cloud data warehouses rather than inside proprietary stacks. Clean, unified data is now a prerequisite for AI and advanced analytics. As the warehouse becomes the center of gravity, the logic of tightly coupled suites makes less sense. When data and identity live outside the suite, leverage shifts back to the enterprise.


That’s not a collapse of the suite idea. It’s a reassessment of where control and value should sit.


 


Q: If Suite Fatigue is real, what structurally changes the dynamic?


The shift isn’t about swapping one vendor for another. It’s about moving where control lives. Historically, suites anchored data collection, identity, and orchestration inside their own walls. That centralization created strength, but it also created gravity. When the system of record sits inside the vendor, every price change or migration ripples across the entire stack.


When the foundation of data and identity sits in a neutral layer like the cloud warehouse, the suite becomes one layer among many, not the center. That changes leverage. You can replace activation tools without rebuilding the whole system. You can evaluate vendors on fit rather than switching cost. Complexity doesn’t disappear, but the locus of control shifts.


Organizations feeling marketing suite fatigue can begin to extricate themselves in a few ways:


●      Pick a few marketing suite applications that are not delivering value and switch to newer products, so you are diversifying your Martech stack

●      Replace marketing suite data collection with your own data collection, so you aren’t as reliant on the marketing suite vendor. Consider routing customer data to the warehouse first and then to the marketing suite, so you own your data and have the flexibility to re-route customer data to any Martech tool in the future


●      Back up marketing suite data in a cloud data warehouse and add a composable CDP on top of the warehouse to move identity, audiences, and journeys upstream to the warehouse instead of having them be trapped in and owned by the marketing suite vendor
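The warehouse-first idea in the second step can be sketched in a few lines. This is a minimal sketch assuming a generic event pipeline: every customer event lands in the warehouse (the system of record the enterprise owns) before copies fan out to activation tools. Names like "marketing_suite" and "composable_cdp" are placeholders, not a specific vendor's API.

```python
def route_event(event: dict, warehouse: list, downstream: dict) -> None:
    warehouse.append(event)           # 1. land the event in the warehouse first: you own the data
    for sink in downstream.values():  # 2. then forward copies to any activation tool
        sink.append(event)

warehouse, suite, cdp = [], [], []
route_event({"user": "u1", "type": "page_view"},
            warehouse, {"marketing_suite": suite, "composable_cdp": cdp})
print(len(warehouse), len(suite), len(cdp))  # -> 1 1 1
```

Swapping out or adding a downstream tool later only changes the `downstream` map; the collection layer, and therefore the leverage, stays with the enterprise.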


Suite Fatigue isn’t solved by emotion or revolt. It’s solved by architecture. When enterprises own the foundation of their data and identity, they regain flexibility and reduce dependency in a way that feels practical, not just strategic.

Supporting Data Points to Highlight:


●      94% of 52 enterprise teams reported some level of frustration with their marketing suite.


●      79% cited complexity, cost, or vendor lock-in as core challenges.


●      42 mentions of implementation and UI complexity, the most cited issue.


●      29 mentions of high costs, including renewal increases and services dependency.

 

The Modern Marketing Mess: How Bear Cognition’s SwaS® Model Optimizes Business Intelligence


marketing 2 Mar 2026

Global martech spending is projected to reach $215 billion by 2027, but value realization remains a concern.


As campaigns span dozens of disconnected platforms, decision-making is increasingly driven by guesswork rather than insight. 

With data fragmented across CRMs, ad platforms, and analytics tools, marketing leaders lack a unified view of spend, performance, and ROI—slowing strategy and inflating wasted budget.

Q) Despite increased investments in martech tools, why are marketing leaders still grappling with stack complexity and data integration challenges?

Because more tools don’t equal more intelligence.

Most marketing stacks evolved tool by tool: CRM here, ad platforms there, analytics layered on top. Each platform optimizes its own slice of performance, but none are designed to create a unified operational view.

We’ve seen marketing teams with 15+ tools in their stack—and still exporting CSVs every Friday to reconcile numbers before a leadership meeting. One client described it as “spending more time defending the data than discussing strategy.”

The result:

• Fragmented data across paid, owned, and earned channels
• Conflicting metrics and reporting definitions
• Manual reconciliation in spreadsheets
• Lagging insight instead of real-time clarity
 

Marketing leaders are investing in capability, but without integration and orchestration, the stack becomes operationally complex instead of strategically intelligent.

 It’s not a tooling shortage.

It’s an integration and cognition gap.
 
(Q) How can AI move marketing teams from reactive performance tracking to proactive campaign optimization?

 

Traditional BI tells you what happened. By the time you see the report, the budget is already spent.

 
We’ve worked with teams who discovered a campaign was underperforming halfway through the month—not because the data wasn’t there, but because no system was actively monitoring for deviation. By the time someone noticed, six figures had already been allocated inefficiently. 

AI shifts marketing from observation to action by:

 
• Continuously ingesting cross-channel data
• Detecting anomalies and performance deviations in real time
• Forecasting likely outcomes before spend is exhausted
• Triggering automated optimization workflows

Instead of waiting for a weekly report to explain underperformance, AI-driven systems surface risk signals early and recommend—or execute—corrective action.

 That’s the difference between tracking performance and engineering performance. 
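The anomaly-detection step above can be illustrated with a small sketch. This is not a specific vendor's system; it is a generic rolling z-score check that flags a campaign day whose cost-per-acquisition deviates sharply from the trailing baseline, so budget can be corrected before the month is spent. The metric, window, and threshold are illustrative assumptions.

```python
import statistics

def flag_anomalies(daily_cpa: list[float], window: int = 7, z_threshold: float = 2.0) -> list[int]:
    """Return indices of days whose CPA deviates sharply from the trailing window."""
    flagged = []
    for i in range(window, len(daily_cpa)):
        baseline = daily_cpa[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline) or 1e-9  # guard against a flat baseline
        if abs(daily_cpa[i] - mean) / stdev > z_threshold:
            flagged.append(i)  # surface a risk signal the same day, not at month-end
    return flagged

# Steady CPA around $40, then a sudden spike on day index 10.
cpa = [41, 39, 40, 42, 40, 38, 41, 40, 39, 40, 85]
print(flag_anomalies(cpa))  # -> [10]
```

A production system would feed this kind of check with continuously ingested cross-channel data and wire the flagged indices into an automated optimization workflow; the sketch only shows the detection step.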

(Q) What tangible benefits can marketing leaders gain by unifying data insights across multiple channels into a single source?

When marketing data is unified into a true single source of truth, leaders gain:

• Clear attribution across channels
• Real-time visibility into ROI
• Reduced reporting time and manual reconciliation
• Faster decision cycles
• More precise budget allocation

But the real benefit is strategic confidence.

 In fragmented environments, meetings often start with, “Whose numbers are right?” In unified environments, meetings start with, “What are we doing next?”

 We’ve seen teams cut reporting time dramatically—not because they worked harder, but because they stopped rebuilding the same narrative every week. Once the foundation is trusted, attention shifts to optimization.
 

Unified intelligence eliminates guesswork. Leaders move from debating numbers to debating strategy. Decisions accelerate because the data foundation is stable.
 

That clarity drives measurable ROI and aligns directly with Bear Cognition’s core commitment: transforming inefficiencies into growth and measurable business impact.

 

(Q) How does Bear Cognition’s Software with a Service (SwaS®) model solve the execution gap that traditional BI and martech tools cannot address?
 
Traditional BI platforms stop at insight. They generate dashboards and reports—then leave execution to the team.

That’s the execution gap.

We’ve seen organizations invest heavily in analytics platforms only to discover that insight without workflow integration simply adds another dashboard to check. One marketing leader told us, “We have great visibility. We just don’t have the bandwidth to act on it consistently.”

SwaS® closes that gap by combining:

• AI-powered technology
• Process automation
• Embedded optimization workflows
• Hands-on expertise

We don’t just surface insights. We operationalize them.
 

Our model integrates into existing systems, refines workflows, and continuously optimizes performance alongside the client. That hybrid approach ensures adoption, accountability, and ongoing improvement—not just analytics sitting unused. 

We don’t sell software.

We engineer intelligence that transforms decision-making into measurable execution. 

(Q) How does Bear Cognition empower agencies to scale intelligence, reporting, and optimization while maintaining their own workflows without overhauling existing systems?

Agencies don’t want another platform to manage. They want leverage.

We’ve spoken with agency operators juggling client dashboards across multiple ad platforms, manually compiling executive summaries late at night before board meetings. The friction isn’t creativity—it’s operational drag.

Bear Cognition integrates into existing agency ecosystems—ad platforms, CRM, analytics tools—without forcing a full-stack replacement. 

Through SwaS®, agencies gain:

• Centralized, cross-platform intelligence
• Automated reporting and data consolidation
• AI-driven anomaly detection and forecasting
• Continuous optimization support

All while preserving their internal workflows and client-facing processes.

We refine and optimize how teams already operate rather than forcing disruptive change. That’s core to our problem-first, tech-second philosophy.

The outcome is scalable intelligence. Agencies can serve more clients, with deeper insight and less manual overhead, without rebuilding their tech stack from scratch.

 

 The Back Office Revolution: How AI Is Reshaping Residential Brokerage Operations


marketing 2 Mar 2026

Responses by Eric Bramlett, broker-owner of Bramlett Partners

How is AI transforming brokerage operations without replacing real estate agents?


AI is changing the back office, not the front door. At our brokerage, we're using it to analyze calls, track lead follow-up quality, understand agent performance, and automate all the other operational stuff that was consuming hours and hours of our staff's time. None of those activities replaces the agent. It replaces the spreadsheet and the audit.

The truth is, being in real estate is a relationship-based business. 65% of all our business comes from our sphere of influence. Past clients, referrals, people who know us and trust us. An AI can't make a call to a past client to celebrate their birthday. It can't sit across from someone at a kitchen table and help them make the biggest financial decision of their life. What it can do is remind the agent to make that call, give them information to make that call more valuable, and free up time so that the agent can focus on the activities that actually drive their business.

 

The agents who should be concerned are those who weren’t doing much to begin with. AI is raising the floor for what constitutes a well-run operation, and those agents are finding fewer places to hide. However, for agents who are actually good at their jobs, AI is making those agents faster and better-supported, which is a far cry from replacing them.


Where does AI meaningfully improve efficiency in real estate workflows, and where does human judgment remain essential?

 

AI is very, very good at things like pattern recognition, data processing, and repetitive tasks. We utilize this for call analysis, listening to thousands of lead calls and determining their quality, follow-up compliance, and conversion rates. This process would require a human to listen to recordings for hours, whereas now we get this data in near-real time and with greater accuracy.
 


We also utilize AI for market research, competitive analysis, and tool-building. We can go from determining a problem to having a working prototype in days thanks to AI-assisted coding, and this is a massive benefit for a smaller independent brokerage competing against organizations that have nine-figure tech budgets.


Where human judgment is still essential is where the stakes are high in terms of trust, subtlety, or negotiation. Figuring out the price of the house in an ever-changing market. Working with a client who is emotionally attached to the house. Dealing with someone who is being unreasonable in the negotiation. Reading the room in the listing presentation. These are high-context, high-stakes situations where the wrong decision can cost someone real money. AI can certainly guide the decision, but it cannot make it.


The brokerages that will win are the ones that draw the line in the right place: they know where the machine work is, where the human work is, and where the client really values the human work.
 

How is the decline of information asymmetry reshaping the real estate industry as consumers gain access to real-time listings, pricing models, and predictive analytics?


This has been going on for 25 years now. Zillow didn’t kill real estate agents. Redfin didn’t kill real estate agents. And the next generation of AI-powered consumer tools won’t either. What they will do is raise the bar on what a real estate agent must offer to a consumer.

When I first started out, real estate agents had access to information that consumers did not have access to. Well, that's no longer true. Then, real estate agents had access to market knowledge that consumers did not have access to. That's no longer true either. The information advantage is essentially dead, and good riddance to it. 


What hasn’t changed is that the consumer still wants someone they trust to assist them in an incredibly complex, high-stakes, emotional transaction. The data is available, but the judgment about what to do with it is not. Knowing the value of your home is between $450K and $510K is not the same as knowing how to sell it for $479K in this submarket this week in a way that generates multiple offers.

Agents who thought they were the gatekeepers of information are already gone or going. Agents who will thrive are the ones who offer interpretation, strategy, and advocacy. AI offers the consumer more data, and that’s only made the job of the good agent more important, not less.


How can brokerages adopt AI responsibly while preserving consumer trust?

 

So, the first question you need to ask yourself is, does this make the client experience better or worse? If you’re using this technology to make your agents respond faster, to make your agents better prepare their CMAs, to catch a compliance issue before it becomes a problem, that builds trust. If you’re using this technology to send general spam emails to a lead list, that destroys trust.


The biggest risk, I think, that brokerages face is that they’re going to start to use this technology to scale bad behavior. They’re going to start to send follow-up communications that feel robotic, they’re going to start to use chatbots as a replacement for human responsiveness, and they’re going to start to use predictive analytics to pressure consumers into making decisions. All of that is possible, and all of that is going to backfire.


The policy at our brokerage is straightforward. Technology does reminders and processes, while humans make the calls and build the relationships. We leverage AI in the background to make our agents better prepared and more efficient. The consumer should experience the benefit of AI, such as faster response times, more accurate information, and better-prepared agents, without ever feeling like they are speaking to a computer.


Transparency is also important. If the market analysis is being produced by AI, the agent should be able to understand it well enough to interpret, question, and supplement it. If the agent is simply passing along an AI-generated report that they don’t understand, that’s not service. That’s a risk.
 


What should brokerages consider when integrating AI tools into daily operations?


The first is adoption, and adoption is a non-negotiable metric. It doesn’t matter how powerful a tool is if agents are not going to use it. We’ve all been there: we buy a bunch of shiny new tools, we throw a webinar, and six months later, adoption is at 12%. We build for the agent who’s got 15 minutes to spare between showings, not for the one who’s got an hour to tweak settings.
 


And then there are a handful of principles that we follow:

● First, boring is better than brilliant. A system that works every time is better than a system that’s brilliant sometimes.

● Second, assume that non-technical agents are going to misuse a tool, so build barriers first, features second.

● Third, keep things simple, keep things small. The more tools, the more logins, the more training, the more things that can go wrong, so our dream is one dashboard.

 

The other thing brokerages underestimate is the speed advantage. With AI and Agentic Coding, the field has been leveled between small independent brokerages and large franchises. We can develop a custom internal tool in a weekend. A large franchise would take months to simply scope the same project through their committee structure. This is a real advantage, and it is growing. If you are not developing internal tools, you are falling behind the brokerages that are.

 

Finally, and this is an important point, do not let the technology of AI and Agentic Coding distract you from the fundamentals of a great brokerage business. The very best technology in the world will not save a brokerage that does not have its culture, hiring, and client service right. AI and Agentic Coding simply magnify what you already have. If you start with a mediocre business, you simply get faster mediocrity.
 


How might AI adoption in real estate contribute to broader productivity growth across the U.S. economy?


Real estate affects almost every family in the country. It is one of the biggest industries and one of the biggest sources of employment. If AI can make this industry 15-20% more efficient, the ripple effects are enormous. Smoother closings, reduced friction costs, more effective allocation of agent time to high-value tasks, and fewer deals falling through because of preventable mistakes.


But I think the bigger picture is what this technology means to small and medium-sized businesses, not just real estate. We are a 148-agent brokerage competing with publicly traded companies and national chains. AI has given us the tools to build and move at a speed that would have required a dedicated engineering team in the past. That is a huge productivity gain, not just for us, but for every small business that learns how to leverage these tools.


The businesses that are going to drive the next big wave in economic productivity are not the ones with the biggest budgets; they are the ones that can move the fastest. AI rewards speed, flexibility, and the willingness to build, and that is where smaller, focused operators have an advantage over bigger, more bureaucratic ones.


And if that is the case, and I think it is, then the productivity gains are going to be huge because you’re unlocking the innovation potential of many more businesses, not just the ones that could afford to play in the first place.

The Revenue Visibility Gap: What One Engineering Firm Reveals About Martech Maturity


marketing 27 Feb 2026

 

By Marki Landerud, Vice President of Marketing at Marketri

Revenue visibility challenges are often treated as industry-specific.

In reality, they are martech maturity problems hiding inside operational silos.

One national engineering firm recently uncovered a significant blind spot inside its growth engine. Operationally, the organization was disciplined. Systems were stress tested. Assumptions were validated. Performance was instrumented and monitored over time.

Revenue generation was not.

The issue was not weak expertise or lack of effort. It was the absence of instrumentation across the CRM and revenue operations infrastructure.

For marketing and RevOps leaders, the lesson is clear: without structured lifecycle governance and clean data architecture, even sophisticated organizations operate with incomplete visibility.

The Illusion of Revenue Stability

Many services-based organizations, including engineering firms, have grown on the strength of reputation and relationships. Long-standing clients provide repeat work. Project managers maintain trusted networks. Conferences are attended. Proposals are written.

From the outside, activity looks healthy.

But when leadership asks fundamental revenue questions, the answers are often unclear:

  • Which channels are producing high-value opportunities?
  • How long does an opportunity remain in each lifecycle stage?
  • Where are deals stalling?
  • Which disciplines are driving new demand?
  • How exposed are we to client concentration risk?

These are not just business development questions. They are martech maturity questions.

Without standardized CRM infrastructure, defined lifecycle stages, and integrated reporting, revenue can appear stable while underlying risk accumulates.

  • A single large client reducing volume.
  • A market segment cooling.
  • A specialty discipline operating below target utilization.

These shifts often feel sudden. They are rarely unpredictable. They simply were not measured.

The Case Study: Instrumentation Before Intelligence

No engineering firm would operate critical infrastructure without monitoring performance data. Yet this firm was operating its growth engine without centralized CRM governance, consistent follow-up processes, or reliable attribution tracking.

Marketing activity existed. Business development activity existed. But the systems connecting them were fragmented.

Leadership could not clearly connect marketing investment to project type, utilization by discipline, or revenue contribution. Forecasting leaned more on intuition than on data.

Once a centralized CRM infrastructure was implemented, pipeline stages were defined, and follow-up was standardized, patterns became visible.

Leadership could see:

  • Which service lines generated demand
  • Which inquiries aligned with priority project types
  • How long opportunities remained in evaluation
  • Where additional visibility was needed

Close rates improved from approximately 30 percent to 43 percent after structured follow-up and deal tracking were introduced. Web-generated opportunities closed at 35 percent, outperforming common benchmarks. Conversions from paid search improved by 30 percent once campaigns aligned with priority disciplines and lifecycle data.
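The instrumentation described above comes down to a few simple calculations over consistently structured CRM records. A minimal sketch of how close rate and time-in-stage fall out once lifecycle stages are standardized (the field names and figures here are invented for illustration, not taken from the firm's actual CRM):

```python
from datetime import date

# Hypothetical CRM export: one record per opportunity, with the date it
# entered each defined lifecycle stage. Field names are illustrative only.
opportunities = [
    {"id": 1, "source": "web",
     "stages": {"inquiry": date(2025, 1, 6), "evaluation": date(2025, 1, 13)},
     "won": True},
    {"id": 2, "source": "paid",
     "stages": {"inquiry": date(2025, 1, 8), "evaluation": date(2025, 2, 2)},
     "won": False},
    {"id": 3, "source": "web",
     "stages": {"inquiry": date(2025, 1, 10), "evaluation": date(2025, 1, 20)},
     "won": True},
]

def close_rate(records):
    """Share of opportunities marked won."""
    return sum(r["won"] for r in records) / len(records)

def avg_days_in_stage(records, start, end):
    """Average days between entering `start` and entering `end`."""
    spans = [(r["stages"][end] - r["stages"][start]).days
             for r in records if end in r["stages"]]
    return sum(spans) / len(spans)

web = [r for r in opportunities if r["source"] == "web"]
print(f"overall close rate: {close_rate(opportunities):.0%}")  # 67%
print(f"web close rate: {close_rate(web):.0%}")                # 100%
print(f"avg days inquiry->evaluation: "
      f"{avg_days_in_stage(opportunities, 'inquiry', 'evaluation'):.1f}")  # 14.0
```

None of this is sophisticated analytics; the point is that these questions become answerable at all only once every opportunity carries the same stage definitions and dates.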

These were not just marketing wins. They were instrumentation corrections.

Behavior Changes Before Revenue Does

In this firm, revenue growth did not appear first. Operational discipline did.

When lifecycle stages were clearly defined and enforced, pipeline timing became measurable. Time in early opportunity stages and assessment phases could be tracked and improved.

Email outreach that once relied on individual effort saw engagement increase by 140 percent after structured workflows were implemented. Technical content authored by engineers generated more than 56,000 pageviews, with 73 percent of traffic coming from organic search. With proper attribution tracking, that visibility translated into qualified pipeline rather than isolated traffic metrics.

The measurable improvement in close rates and engagement preceded financial impact.

Within the first full year after implementing a structured growth system, revenue contribution exceeded total commercial investment by a factor of two.

The improvement did not come from increased activity.

It came from replacing anecdote with data.

Revenue Intelligence Requires a Clean Foundation

Many organizations are layering AI-driven revenue intelligence onto inconsistent CRM inputs and undefined lifecycle stages.

That approach rarely produces clarity.

AI does not compensate for poor data hygiene. It amplifies it.

In this case, meaningful performance improvement occurred only after the data foundation was stabilized. Lifecycle stages were standardized. Follow-up discipline was enforced. Attribution became visible.

Only then could forecasting become reliable.

Client Concentration and Enterprise Risk

This firm also faced revenue concentration exposure. A meaningful percentage of annual revenue was tied to a small number of long-standing clients. While stable on the surface, this structure introduced vulnerability.

Revenue concentration above 30 percent in a single client or sector is a structural risk few organizations would accept elsewhere in operations. Yet many tolerate it within their portfolios because CRM segmentation and reporting lack clarity.
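Once revenue is segmented by client in the CRM, the 30 percent concentration threshold becomes trivial to monitor. A minimal sketch, with invented client names and figures:

```python
# Hypothetical annual revenue by client, in dollars (illustrative only).
revenue_by_client = {
    "Client A": 4_200_000,
    "Client B": 1_100_000,
    "Client C": 900_000,
    "Client D": 800_000,
}

def concentration(revenue, threshold=0.30):
    """Return each client's revenue share and flag any above the threshold."""
    total = sum(revenue.values())
    shares = {name: amount / total for name, amount in revenue.items()}
    flagged = {name: s for name, s in shares.items() if s > threshold}
    return shares, flagged

shares, flagged = concentration(revenue_by_client)
for name, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {share:.0%}")
# In this invented example, Client A carries 60% of revenue --
# twice the 30% structural-risk line.
```

The same segmentation supports the diversification work described next: once shares are visible, leadership can set explicit targets for reducing them rather than reacting after a large client pulls back.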

With improved visibility into demand flow, leadership could intentionally diversify. They could identify which capabilities resonated in adjacent markets and reduce reliance on reactive selling.

Predictability improved.

And predictability strengthens enterprise value.

Applying Engineering Rigor to Martech Governance

Engineers solve complex problems through a clear process:

  • Establish a baseline.
  • Identify root causes.
  • Implement measurable solutions.
  • Monitor performance and refine over time.

Revenue operations and martech governance benefit from the same discipline.

A structured growth system connects marketing, business development, and sales into a single visible pipeline. It standardizes lifecycle definitions. It enforces data hygiene. It provides attribution clarity. It allows leadership to forecast with greater confidence.

This is not about increasing promotional activity.

It is about reducing uncertainty.

Closing the Visibility Gap

The revenue blind spot uncovered in this engineering firm is not unique to engineering.

It is a martech maturity issue.

When leaders cannot clearly trace how opportunities originate, how they progress, and how investment translates into predictable revenue contribution, they are operating without full visibility.

The organizations that close this gap do not necessarily work harder. They measure better.

 

 

How Collage is Reshaping Asset Management for Growing Brands


marketing 26 Feb 2026

You’re stepping into the CEO role at Collage as the company enters its next phase of growth. What’s your vision for Collage moving forward?

Collage was built on a simple belief: managing important brand content shouldn’t feel heavy, complicated, or out of reach for growing teams.

As we enter our next phase, our vision is to become the modern infrastructure layer supporting how brands organize, distribute, and ultimately get more value from their content. We’re focused on doing the core work of Digital Asset Management exceptionally well — search, organization, and sharing — while removing the enterprise complexity that has defined the category for a decade.

We’ve grown consistently by serving teams who felt priced out or overwhelmed by legacy systems. Moving forward, we’ll double down on that momentum: improving migration experiences, strengthening our distribution capabilities, and investing in performance and simplicity.


How do you define Digital Asset Management today, and where do legacy platforms fall short?

At its core, Digital Asset Management should be the system of record for brand content — the place where assets are structured, searchable, and controlled, ultimately with the purpose of optimizing the value created by that content.

Historically, though, DAM platforms were designed more like vaults — places to store and protect files. Distribution was treated as a secondary feature. And over time, that inward focus created layers of complexity and rising costs.

Today, content flows outward constantly — to agencies, retail partners, sales teams, distributors, media outlets, technology platforms. If distribution isn’t central to the DAM, teams default to workarounds: email attachments, shared drives, Slack threads, ad-hoc file transfers. That fragmentation is where control and efficiency break down.

A modern DAM needs to function as an engine, not just a vault. It should activate content — making it easy to find, permission, package, and distribute without adding operational overhead.

Where legacy platforms often fall short is assuming every team needs enterprise-level complexity. Many growing organizations don’t. They need speed, clarity, and workflows that match how content actually moves.

That’s the shift we believe the category needs to embrace. By shifting focus to what we believe matters most for these teams, we can fundamentally make the platform more affordable without sacrificing impact for the vast majority of brands out there.


Collage describes itself as “distribution-first.” What does that mean in practice?


Distribution-first means designing DAM around how content is actually used — not just how it’s stored. At Collage, we focus on two core audiences:
  1. The teams who create and manage assets
  2. The audiences who consume and amplify them

By stripping away unnecessary complexity and streamlining distribution, we make modern DAM fundamentally more cost-effective, scalable and user-friendly. Practically, that shows up in capabilities like branded distribution portals, dynamic share links, asset embeds, flexible permissions, and modern search. Instead of responding to constant requests, teams can proactively enable access — reducing friction, eliminating bottlenecks, and accelerating brand amplification.
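The "dynamic share link" idea above can be sketched in a few lines: a token maps to a set of assets plus a permission level, so access can be changed or revoked without resending files. This is a generic illustration of the pattern, not Collage's actual API; every name and structure here is hypothetical.

```python
import secrets

# Hypothetical asset catalog and share-link store (in-memory for the sketch;
# a real system would persist these and enforce authentication).
assets = {
    "logo.svg": {"tags": ["brand", "logo"]},
    "q3-deck.pdf": {"tags": ["sales", "deck"]},
}
share_links = {}

def create_share_link(asset_ids, permission="view"):
    """Mint an unguessable token granting `permission` on the given assets."""
    token = secrets.token_urlsafe(8)
    share_links[token] = {"assets": list(asset_ids), "permission": permission}
    return token

def resolve(token):
    """Look up a token; returns None if the link was revoked or never existed."""
    return share_links.get(token)

token = create_share_link(["logo.svg"], permission="download")
print(resolve(token))      # the granted assets and permission
share_links.pop(token)     # revoking is just deleting the mapping
print(resolve(token))      # None: the link no longer resolves
```

The design point is that the link, not the file, is the unit of access: recipients always fetch the current asset through the token, so updates propagate and revocation is immediate.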


What problem does Collage solve that other DAM platforms struggle with?


We solve the mismatch between modern marketing teams and legacy software expectations.


Many growing brands are managing thousands of assets, collaborating with agencies, dealers, distributors, or partners — but they don’t have enterprise budgets or supporting IT departments.


Legacy DAM platforms often assume both. Collage removes the friction around:

- High contract costs
- Complex platforms that are difficult to adopt
- Fragmented systems for managing content
- Inability to activate content even if centralized and organized


We provide the essential capabilities teams actually use — search, tagging, portals, custom metadata, structured organization — without forcing them into a heavyweight system.


That balance of power and simplicity is where we see the most traction.


Who is Collage built for, and how does it fit into the broader DAM market?

Collage operates squarely in the Digital Asset Management category, but our philosophy is different. While many legacy vendors prioritize enterprise breadth and highly specialized functionality, we emphasize:


●        Simplicity over complexity
●        Access and distribution
●        Practical value over feature accumulation


Rather than building another feature-heavy system, we designed Collage from the ground up to focus on essential functionality with real impact. That focus allows us to deliver a powerful DAM experience without the cost or complexity that has deterred so many teams. That makes Collage especially well-suited for growing brands that have historically been underserved or priced out of DAM altogether — as well as modern teams that want speed, clarity, and control without enterprise overhead.


How do you think DAM needs to evolve to keep pace with modern marketing workflows?


Digital Asset Management needs to become more connected, more intelligent, and more flexible.


The future of DAM is less about managing files and more about orchestrating access — ensuring the right people can find and use the right assets at the right time.


First, platforms must integrate seamlessly with the tools marketers already rely on — design software, CMS platforms, automation tools, and perhaps most importantly, AI systems. DAM can’t be an isolated repository; it should facilitate a broader content ecosystem.


Second, search and metadata management must continue to evolve. As asset libraries grow, discovery becomes mission-critical. Teams need fast, precise ways to surface what matters. This is also essential for fully leveraging the emerging AI capabilities. 


And finally, AI will reshape expectations entirely. Most of what we see as AI in the category will become commoditized. However, the platforms that evolve distribution into AI-ready infrastructure will be the ones that deliver lasting value.


The companies that embrace this shift — from vault to engine, from storage to orchestration — will define the next generation of DAM.

 
What does success look like for Collage over the next few years?

Success means making DAM feel approachable, effective, and indispensable for modern teams. If Collage can help brands reduce friction, save time, and get more value from their content - while reducing complexity - we're doing our job.


Ultimately, our goal is to redefine expectations for DAM: simpler, more affordable, and built for how content actually moves in today’s organizations.
   
