artificial intelligence 23 Mar 2026
Enterprise resource planning (ERP) platforms hold some of the most valuable operational data inside organizations—but extracting meaningful insights from that data often requires complex reporting tools and technical expertise. A new partnership aims to change that for the NetSuite ecosystem.
Detroit-based NetSuite consulting firm Snapshot and enterprise AI platform provider MindsDB have announced a strategic collaboration to deliver AI-powered conversational analytics for businesses running NetSuite.
The partnership combines Snapshot’s operational and ERP expertise with the AI capabilities of the MindsDB Enterprise AI Platform. The goal: allow companies to interact with their NetSuite data, and data from other enterprise systems, through AI-driven insights, automation, and natural language queries.
For organizations that rely on Oracle NetSuite to manage financials, supply chains, and operations, the integration could unlock faster decision-making across business units.
ERP systems like NetSuite serve as the operational backbone for many organizations, managing everything from accounting and inventory to procurement and logistics. But turning that data into actionable insights typically involves manual reporting workflows or external analytics platforms.
Snapshot and MindsDB aim to streamline that process by creating an AI layer capable of connecting directly to enterprise datasets.
At the center of the collaboration is the Minds Enterprise platform, which acts as the AI backbone for deploying models, agents, and conversational analytics across enterprise data sources.
Snapshot will build on that foundation by applying its expertise in NetSuite data modeling and ERP implementation. The firm plans to translate complex ERP structures into AI-ready data frameworks that can power predictive analytics and automated insights.
“Our partnership with MindsDB allows us to deliver AI capabilities, autonomous agents, and statistical analytics that are deeply connected to how NetSuite customers actually operate,” said Tania Sottrel.
While the collaboration focuses on NetSuite environments, the platform is designed to extend beyond a single ERP system.
Companies will be able to combine NetSuite data with information from other operational platforms.
This multi-source approach enables organizations to build a unified intelligence layer across their operations rather than relying on isolated reporting tools.
According to Brad Gyger, the goal is to bring AI directly to where enterprise data already lives.
“MindsDB was built to bring AI directly to enterprise data,” Gyger said. “By partnering with Snapshot, we’re enabling NetSuite customers to unlock AI insights across their ERP systems and the broader platforms that power their businesses.”
Traditional ERP reporting often focuses on historical metrics—financial statements, inventory counts, or operational summaries. While useful, those reports rarely provide predictive insights or automated recommendations.
The Snapshot–MindsDB platform aims to expand those capabilities with AI-driven features such as predictive analytics, automated insights, and natural language querying.
Instead of static dashboards, companies will be able to ask questions in natural language and receive AI-generated insights derived from ERP and operational data.
This conversational approach reflects a broader shift toward AI-driven business intelligence, where machine learning models continuously analyze operational data to identify patterns and recommend actions.
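As a rough sketch of the pattern described above (not MindsDB's or Snapshot's actual API), a conversational analytics layer resolves a natural-language question to a governed query template before anything is executed; all table and metric names here are hypothetical:

```python
# Toy natural-language-to-query router. A production system would use an LLM
# plus a semantic layer; a simple keyword map stands in for both here.
QUERY_TEMPLATES = {
    "revenue": "SELECT SUM(amount) FROM invoices WHERE period = :period",
    "inventory": "SELECT item, on_hand FROM inventory WHERE on_hand < reorder_point",
}

def answer(question: str) -> str:
    """Return the SQL a question maps to; a real system would execute it
    against the ERP and summarize the result in natural language."""
    for keyword, sql in QUERY_TEMPLATES.items():
        if keyword in question.lower():
            return sql
    return "No matching metric; try asking about revenue or inventory."
```

The key design point is that the model never free-forms queries against raw tables: every question resolves to a vetted, parameterized query.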
The initial focus of the partnership will be industries where NetSuite plays a major role in operational and supply chain management.
Companies in these sectors often manage large product catalogs, complex supplier networks, and fluctuating demand conditions, making them ideal candidates for predictive analytics and automated operational insights.
By combining ERP data with external datasets, the companies hope to enable businesses to anticipate demand shifts, optimize inventory, and identify operational risks earlier.
Snapshot will lead development of NetSuite-specific AI solutions, including models, AI agents, ERP knowledge layers, and connectors that translate NetSuite data into AI-ready frameworks.
The broader goal is to create an intelligence layer that sits on top of NetSuite and integrates with other enterprise systems.
That approach could help companies modernize their analytics infrastructure without replacing existing ERP systems—an attractive option for organizations heavily invested in NetSuite environments.
“Our goal is to empower the NetSuite community with AI that understands how their businesses actually operate,” Sottrel said.
The NetSuite connector for MindsDB is available now, enabling organizations to begin integrating ERP data with the AI platform.
Additional solutions, pilot programs, and industry-focused deployments are expected to roll out throughout 2026 for NetSuite customers and ecosystem partners.
As enterprise AI adoption accelerates, partnerships like this one highlight a growing trend: embedding AI directly into core operational systems rather than layering analytics tools on top.
For NetSuite users seeking deeper insights from their operational data, that shift could transform ERP systems from record-keeping platforms into real-time intelligence engines for business decision-making.
Get in touch with our MarTech Experts.
artificial intelligence 23 Mar 2026
Turning business intelligence into measurable outcomes remains one of the biggest challenges organizations face when adopting AI. Now, research and market intelligence firm Keypoint Intelligence is attempting to close that gap with the launch of Keypoint Engage, a new AI-powered execution services suite designed to transform industry insights into real operational results.
The new offering combines automation, AI-driven tools, and managed services to help companies accelerate execution across sales, marketing, and customer service operations—areas where organizations often struggle to translate strategic insights into day-to-day performance improvements.
Rather than offering standalone software or advisory services, Keypoint Engage is positioned as a managed, outcome-focused solution that delivers operational execution based on Keypoint Intelligence’s decades of industry data and research.
For years, Keypoint Intelligence has been known for its research and testing in the print, imaging, and document technology industries. With Keypoint Engage, the company is shifting from a purely advisory role into a more operational one—helping organizations implement AI-driven strategies directly within their business processes.
According to Randy Dazo, the move reflects a common challenge companies face when adopting AI technologies.
“Businesses want the benefits of AI, but most don’t have the time, resources, or expertise to implement it effectively,” Dazo said. “Keypoint Engage closes the gap between insight and execution by delivering real action and measurable results.”
Instead of requiring companies to master complex AI tools or develop in-house expertise, Keypoint Engage provides end-to-end execution services, including automation, campaign support, and AI-driven operational tools.
The Keypoint Engage platform focuses on three primary operational areas where AI can deliver immediate business impact: sales execution, marketing execution, and service automation.
Each module within the suite is designed to streamline workflows and improve performance through a combination of AI agents, automation tools, and strategic guidance.
The sales-focused component of Keypoint Engage aims to help organizations generate and convert leads more efficiently.
By automating routine outreach and supporting more personalized engagement strategies, the platform aims to strengthen pipeline development and improve customer interactions.
Marketing teams often face pressure to launch campaigns faster while producing more content across multiple channels. Keypoint Engage addresses that challenge with AI-powered campaign planning and content development tools.
Among the platform's supported capabilities is AEO-focused content optimization.
AEO, or Answer Engine Optimization, focuses on optimizing content for AI-powered search and conversational platforms rather than traditional search engines alone.
By automating content development and campaign workflows, Keypoint Engage aims to reduce the time and manual effort required to deploy marketing initiatives.
Customer service operations represent another area where AI-driven automation is rapidly gaining traction.
Keypoint Engage introduces tools designed to streamline support interactions and improve service consistency.
The goal is to help organizations provide faster, more consistent customer experiences while reducing the operational costs associated with high-volume support environments.
The launch of Keypoint Engage marks a significant evolution for Keypoint Intelligence as the company moves deeper into operational services.
Traditionally, firms like Keypoint have focused on market analysis, benchmarking, and strategic insights. But as AI tools become more accessible, businesses increasingly want partners that can not only provide insights but also help execute strategies.
Keypoint Engage reflects that shift by blending industry expertise, automation technologies, and managed service delivery into a single offering.
The company says the suite will also help organizations better evaluate and optimize the role of print and document technologies within modern business workflows—areas where Keypoint Intelligence has long-standing expertise.
The broader market trend behind Keypoint Engage is the emergence of AI execution services—solutions that go beyond software to deliver managed outcomes.
Many organizations are enthusiastic about AI’s potential but lack the internal resources needed to integrate it into existing processes. That has created growing demand for vendors capable of delivering AI-powered business execution without requiring deep technical expertise from customers.
By packaging AI automation with industry intelligence and operational services, Keypoint Intelligence hopes to provide companies with a more accessible path to measurable results.
Keypoint Engage is available now as a managed service suite for organizations looking to accelerate sales, marketing, and service performance using AI-driven automation.
The company says additional resources—including a dedicated landing page and overview materials—are available to help organizations explore how the platform can support their operational strategies.
For businesses seeking to move beyond experimentation and toward real AI-driven execution, Keypoint Engage represents a new model: turning insight into action—and action into measurable growth.
artificial intelligence 23 Mar 2026
Advertising research firm MediaScience has introduced a new AI-driven approach that could fundamentally change how marketers test and optimize ad creative.
The company announced Creative Twin™, a new methodology that uses artificial intelligence to recreate advertisements with near-perfect accuracy and then test the impact of individual creative elements within them. The breakthrough will be formally presented at the Audience x Science Conference hosted by the Advertising Research Foundation.
Developed using proprietary technology within MediaPET.ai—a spinoff initiative from MediaScience—the system enables researchers to generate AI replicas of advertisements that audiences reportedly cannot distinguish from the originals.
For marketers increasingly focused on creative effectiveness and personalization, the technology promises something long considered nearly impossible: isolating and measuring the precise impact of individual creative decisions within an advertisement.
Creative optimization has traditionally relied on A/B testing or multiple ad versions, both of which can be expensive and time-consuming. Testing individual variables—such as a celebrity endorsement, visual style, or casting decision—often requires entirely new ad production.
Creative Twin aims to eliminate that barrier.
Once an advertisement is recreated as an AI-generated replica, researchers can systematically modify individual elements within the ad—such as talent, visuals, messaging, or background details—while keeping the rest of the creative identical.
The result is a controlled testing environment where marketers can measure exactly how each element influences audience response.
“This represents a fundamental shift in how advertising creative can be evaluated and optimized,” said Duane Varan. “For the first time, researchers can isolate and measure the contribution of individual creative elements within an advertisement.”
To validate the methodology, MediaScience conducted controlled testing with 812 U.S. respondents in collaboration with the Ehrenberg-Bass Institute, one of the world’s most respected academic marketing research institutions.
Participants were shown both original advertisements and AI-generated replicas created using the Creative Twin technology.
According to the study, respondents were unable to distinguish between the original ads and the AI-generated versions, confirming that the replicas maintained the full production quality and realism of the original creative.
That fidelity is critical, because it allows researchers to modify ad components without introducing unintended differences that could skew results.
With the AI-generated “twin” in place, advertisers can test a wide range of creative variables.
For example, marketers can explore variations in talent, visuals, messaging, or background details.
Because each version of the ad remains visually identical except for the specific variable being tested, researchers can determine the incremental impact of each creative choice.
This level of control has historically been difficult to achieve without producing multiple expensive ad variations.
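As an illustration of what "incremental impact" means statistically (a simplified sketch, not MediaScience's published methodology), one can compare the mean response of audiences shown the original ad against those shown a single-element variant:

```python
import math

def incremental_impact(control: list, variant: list):
    """Lift of one creative change: difference in mean response (e.g. a 1-7
    purchase-intent score) between the original-ad and variant-ad groups,
    plus a rough two-sample z statistic."""
    lift = sum(variant) / len(variant) - sum(control) / len(control)

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    se = math.sqrt(sample_var(control) / len(control) +
                   sample_var(variant) / len(variant))
    return lift, (lift / se if se else float("inf"))
```

Because the twin holds every other element constant, the measured lift can be attributed to the single modified variable rather than to incidental production differences.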
Beyond research applications, the technology could also reshape how brands personalize advertising at scale.
Addressable advertising—where ad creative is tailored to specific audience segments—often requires producing multiple versions of the same ad. Creative Twin enables marketers to generate these variations digitally without reshooting the ad.
One example highlighted by MediaScience involved a shampoo commercial.
In the original advertisement, the featured model had straight hair. Using Creative Twin, researchers generated an AI-modified version of the same ad where the model appeared with curly hair.
When shown to audiences with curly hair, the modified ad delivered significantly stronger results across several key marketing metrics.
The results suggest that subtle creative adjustments—when matched to the right audience—can meaningfully improve advertising effectiveness.
The methodology also opens the door to highly targeted creative variations across different industries.
MediaScience outlined several potential applications, each following the same pattern of swapping a single creative element for a targeted audience.
In each case, the modified ad retains the same production quality as the original version, ensuring that the test focuses solely on the element being evaluated.
One of the most intriguing aspects of Creative Twin is its potential to quantify the financial impact of creative decisions.
Advertising production often involves major investments in talent, filming, and design, but marketers rarely have precise data on which elements deliver the greatest return.
Creative Twin allows researchers to test these elements individually, helping brands determine which creative investments deliver the greatest return.
For marketing teams under pressure to prove ROI on advertising budgets, that level of insight could be transformative.
The introduction of Creative Twin reflects a broader trend toward AI-driven experimentation in marketing and advertising.
As generative AI tools become more sophisticated, marketers are gaining the ability to simulate and test creative variations without the traditional production costs associated with advertising development.
For MediaScience, the technology represents the next phase in advertising research—moving from observation to controlled experimentation.
If widely adopted, AI-powered creative replication could reshape how brands design, test, and personalize advertising campaigns in the years ahead.
marketing 23 Mar 2026
Mobile shopping and creator-driven commerce are rapidly reshaping how consumers discover and buy products—and deep linking platforms are becoming essential infrastructure behind the scenes.
Mobile routing platform URLgenius reported record growth in 2025, routing more than 1 billion shopper sessions into major commerce apps and enabling an estimated $6.8 billion in mobile-driven commerce.
The milestone reflects rising demand for seamless mobile experiences across social media, creator platforms, and messaging channels—where most product discovery now begins.
The company said the surge was fueled by the rapid growth of creator-led commerce, increasing mobile-first workflows among marketing teams, and expanding adoption of the URLgenius Marketplace, its platform connecting creators with monetizable product links.
In 2025, URLgenius saw a sharp increase in activity across its platform as brands and creators relied more heavily on deep linking to convert social engagement into purchases.
The company also climbed the ranks of the Inc. 5000, reaching No. 156 among the fastest-growing software companies, up from No. 184 in 2024.
Despite the rapid growth of mobile commerce, many brands still struggle with broken or inefficient links that send shoppers to the wrong destination—or fail to open apps correctly.
Traditional link shorteners typically direct users to a mobile browser, even when the purchase should occur within a native app. That extra friction can lead to significant conversion losses.
URLgenius tackles this issue by routing each click directly to the intended in-app product or content destination, helping marketers avoid what the company describes as “post-click drop-off.”
The platform also works without requiring SDK integrations, allowing brands and agencies to deploy deep linking solutions quickly across multiple channels.
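The routing decision itself can be pictured with a minimal sketch (the link schemes below are hypothetical, not URLgenius's actual formats):

```python
def route_click(product_id: str, user_agent: str) -> str:
    """Send mobile shoppers to the native app page; fall back to the web."""
    app_link = f"shopapp://product/{product_id}"           # opens the app directly
    web_link = f"https://shop.example.com/p/{product_id}"  # browser fallback
    if "iPhone" in user_agent or "Android" in user_agent:
        return app_link
    return web_link
```

In practice the real service also carries attribution parameters and handles per-app link formats, which is exactly the complexity it abstracts away from marketers.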
“Today’s shoppers move from app to app as they discover products through creators, social, and messaging,” said Brian Klais. “Creators run like businesses, but broken routing and lost attribution still kill conversion before the purchase even happens.”
A growing part of the platform’s momentum in 2025 came from AI-powered recommendations designed to help creators maximize earnings.
About 25% of creators used URLgenius recommendation and alert features during the year. These tools highlight high-performing products and monetization opportunities at the moment a creator generates a link.
The feature reflects a broader shift in the creator economy toward real-time monetization decisions, where creators choose affiliate opportunities as they publish content rather than researching programs after the fact.
By surfacing earning potential across hundreds of thousands of marketplace products, URLgenius aims to streamline that process and help creators generate revenue faster.
The platform is designed to support both independent creators and enterprise marketing teams managing large-scale campaigns.
URLgenius routes links across multiple marketing channels, including social posts, influencer campaigns, and messaging.
By directing shoppers to the correct in-app destination, the platform helps reduce conversion friction and improve attribution visibility—two major challenges in modern mobile marketing.
For enterprise teams managing multi-channel campaigns, clearer attribution also means better insight into how social posts, influencer campaigns, and messaging channels drive revenue.
As global marketplaces increasingly open their platforms to creators, many brands are adopting no-code linking solutions that simplify mobile shopping experiences.
URLgenius’ Instant App Linking feature enables brands to roll out deep linking across regions without complex technical integrations.
In one large global deployment highlighted by the company, this approach delivered over 150% growth in creator participation and nearly fivefold improvements in mobile conversion rates.
Those results underscore how critical seamless mobile routing has become in the modern commerce ecosystem.
Behind the scenes, platforms like URLgenius are becoming core infrastructure for the rapidly expanding creator commerce economy.
Consumers increasingly discover products through TikTok videos, Instagram posts, YouTube content, and messaging apps. But turning that discovery into a purchase often requires multiple transitions between platforms and apps.
Deep linking tools help eliminate those friction points by ensuring shoppers land directly inside the relevant app page rather than a generic browser experience.
As creator-led marketing continues to grow, the ability to track and optimize those mobile journeys is becoming a key priority for brands and agencies.
For URLgenius, the numbers from 2025 suggest that infrastructure built for mobile-first commerce and creator-driven marketing is becoming more important than ever.
marketing 23 Mar 2026
As organizations increasingly rely on telemetry and observability data to power operations and AI workflows, protecting sensitive information inside logs has become a growing challenge. A new partnership between two data infrastructure vendors aims to solve that problem without sacrificing usability.
Observability platform provider Coralogix and data privacy platform Skyflow announced a strategic collaboration designed to safeguard sensitive customer data embedded in logs while maintaining the full functionality needed for debugging, security analysis, and AI-driven operations.
The joint solution introduces a privacy-first observability model that replaces traditional redaction techniques with tokenization-based data protection, allowing organizations to keep telemetry data searchable and usable without exposing sensitive information.
Logs and telemetry are foundational to modern software operations. Engineering teams depend on them for diagnosing application issues, investigating incidents, and monitoring system health.
However, these data streams frequently contain sensitive information—such as customer identifiers, account numbers, or personal details—hidden within both structured data fields and unstructured log messages.
Most observability platforms address the issue by masking or removing sensitive data entirely. While that protects privacy, it often reduces the usefulness of logs.
Redacted logs can hinder search, slow incident investigations, and blind downstream analytics.
According to Anshu Sharma, the traditional approach forces organizations into a difficult compromise.
“Once sensitive data is stripped out, teams lose the ability to search effectively, investigate incidents, or let AI agents reason over what actually happened,” Sharma said.
The Coralogix–Skyflow integration takes a different approach by using privacy-preserving tokenization rather than removing sensitive data entirely.
Instead of masking values permanently, Skyflow replaces sensitive data elements with consistent tokens. These tokens maintain the structural integrity of the data, enabling engineers and analytics systems to continue searching, correlating, and analyzing events.
The original sensitive values are stored separately within Skyflow’s secure environment and can only be accessed through controlled, policy-based permissions.
This approach ensures that observability data remains operationally useful while sensitive information remains protected.
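The mechanics can be sketched as follows. This is an illustrative toy, not Skyflow's actual API, and in a real deployment the key would live inside the vault, never in the log pipeline:

```python
import hashlib
import hmac

SECRET_KEY = b"demo-only-key"

class TokenVault:
    """Toy vault: deterministic tokens keep logs searchable; raw values are
    only released after a policy check."""

    def __init__(self):
        self._store = {}

    def tokenize(self, value: str) -> str:
        # Same input -> same token, so two log lines about the same account
        # can still be joined and searched without exposing the account ID.
        digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
        token = f"tok_{digest[:12]}"
        self._store[token] = value
        return token

    def detokenize(self, token: str, role: str) -> str:
        # Access to originals is governed by policy, not by the log store.
        if role != "privacy-admin":
            raise PermissionError("role not permitted to detokenize")
        return self._store[token]
```

An engineer, or an AI agent, can grep, count, and correlate `tok_...` values freely; only a policy-approved role can ever resolve them back to the originals.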
For Ariel Assaraf, the partnership addresses a growing challenge for organizations relying on observability data as a system of record.
“Customers shouldn’t have to choose between safeguarding sensitive data and maintaining operational efficiency,” Assaraf said.
The collaboration is also designed to support the rising use of AI in observability systems.
AI-powered monitoring tools increasingly analyze telemetry data to detect anomalies, predict incidents, and automate remediation processes. However, exposing raw customer data to these systems introduces compliance and privacy risks.
With the tokenization model, AI tools can still analyze logs and telemetry without accessing the underlying sensitive data.
The joint solution pairs tokenized telemetry streams with policy-governed access to the underlying sensitive values.
The result is an observability environment that supports both AI-driven automation and strong data governance.
Global organizations also face increasing regulatory requirements around data residency and sovereignty.
Coralogix already allows customers to deploy observability workloads in specific geographic regions, helping organizations meet regional data compliance standards.
By integrating Skyflow’s runtime data control capabilities, companies can extend that model further—ensuring sensitive customer data remains isolated, access-governed, and auditable while operational telemetry stays locally accessible.
This architecture helps organizations operating across multiple jurisdictions minimize cross-border data exposure while maintaining operational visibility.
The rise of AI-driven infrastructure monitoring, automated operations, and data-heavy applications has made observability more important than ever.
But it has also increased the risk of sensitive information appearing in operational data streams.
The partnership between Coralogix and Skyflow reflects a broader shift toward privacy-by-design observability systems, where security and operational usability are built into the data pipeline from the start.
By combining observability analytics with privacy-preserving data controls, the companies aim to provide organizations with a new model for managing telemetry data in an AI-powered world.
marketing 23 Mar 2026
Enterprise marketing is entering what many executives now call the “AI-native era,” where intelligence isn’t just a tool—it’s the operational backbone of modern marketing systems. With that shift accelerating, Zeta Global is preparing to bring industry leaders together to discuss what comes next.
The company announced that Zeta Live 2026 will take place on October 8, 2026, in New York City, marking the sixth edition of the event and its most ambitious gathering yet. The conference will feature entrepreneur and entertainer Kevin Hart as its first confirmed keynote speaker.
Hosted by Zeta Global, the event has evolved into a major industry gathering for senior marketing executives, AI innovators, and technology leaders navigating the rapid transformation of marketing operations.
Since its launch, Zeta Live has grown alongside the increasing adoption of artificial intelligence across marketing technology stacks.
This year’s event will focus heavily on the shift from AI experimentation to fully operational AI-powered marketing systems, a transition many enterprises are currently navigating.
According to David A. Steinberg, that transformation is redefining how marketing teams operate.
“We are entering the era where AI doesn’t assist marketing—it defines it,” Steinberg said in the announcement. “Organizations that win will be the ones that turn intelligence into action faster than their competition.”
The 2026 program will examine how leading companies are embedding AI across their marketing ecosystems to better understand customer behavior, drive measurable revenue impact, and prove marketing’s contribution to business growth.
The first keynote headliner announced for Zeta Live 2026 is Kevin Hart, one of the world’s most recognizable entertainers and entrepreneurs.
Beyond his career as a comedian and actor, Hart has built a diversified business portfolio, serving as chairman and CEO of Hartbeat and founder of HartBeat Ventures.
At Zeta Live, Hart is expected to share insights from his journey building global brands across entertainment, media, and entrepreneurship—an experience that aligns with the conference’s focus on innovation, creativity, and growth.
Now entering its sixth year, Zeta Live has steadily expanded its role as a high-profile forum for discussions around AI-driven marketing transformation.
The event attracts a mix of enterprise marketing leaders, data scientists, technology vendors, and business strategists seeking to understand how artificial intelligence is reshaping customer engagement.
In recent years, conversations at Zeta Live have increasingly centered on topics such as AI-driven customer engagement, data strategy, and marketing measurement.
As AI becomes embedded across enterprise technology stacks, these topics are moving from theoretical discussions to real operational strategies.
The theme of Zeta Live 2026 reflects a broader shift within the marketing industry.
Traditional campaign-based marketing models are gradually giving way to AI-powered intelligence systems capable of analyzing customer data, predicting behavior, and triggering real-time engagement.
In these environments, AI becomes the central decision-making layer—guiding everything from audience targeting and content delivery to customer lifecycle management.
For enterprise organizations managing massive datasets and multichannel engagement strategies, this transformation represents a fundamental change in how marketing teams operate.
Events like Zeta Live are increasingly positioned as forums where companies can learn from peers already implementing these systems.
While Zeta Global has confirmed the date and first keynote speaker, the full Zeta Live 2026 speaker lineup, venue, and detailed programming will be announced in the coming months.
Industry attendees can register to receive early invitations and updates as the conference schedule takes shape.
Given the growing influence of AI across marketing technology, the 2026 edition of Zeta Live is expected to draw an even larger audience of executives looking to understand how intelligence-driven marketing platforms will shape the next phase of digital growth.
marketing 20 Mar 2026
In a move that underscores the growing shift toward warehouse-native analytics, Kubit has announced a new integration with Snowflake that promises to simplify how enterprises analyze customer behavior and business performance—without moving data out of their core systems.
The pitch is straightforward: stop copying data into fragmented tools and start running analytics directly where it already lives.
That idea isn’t new, but Kubit is betting that tighter execution inside Snowflake’s AI Data Cloud—combined with explainable AI—can finally make it practical at scale.
Enterprises have increasingly standardized on Snowflake as a “single source of truth.” The problem? Product analytics and BI tools haven’t kept up. They often rely on separate pipelines, creating duplicate datasets, inconsistent metrics, and governance headaches.
Kubit’s integration tackles this by querying data directly inside Snowflake environments. That means product, growth, and analytics teams can track customer journeys, behavioral events, and key business metrics—like revenue and lifetime value—without exporting or reshaping data.
In practical terms, this reduces the lag between question and answer. It also cuts down on the quiet chaos of mismatched dashboards—a common pain point in large organizations.
The more interesting angle is Kubit’s AI layer.
Instead of bolting on opaque AI tools, Kubit introduces AI agents that generate and execute SQL queries directly within Snowflake. These agents operate within existing access controls and use a shared semantic layer to keep metrics consistent.
The result:
- Anomaly detection across product and business metrics
- Root-cause analysis for sudden changes
- Natural language report generation
- Narrative summaries backed by live queries
That last point matters. In an era where “AI insights” often feel like black boxes, Kubit is leaning into transparency. Every insight ties back to a verifiable query running in Snowflake—something data teams and auditors alike will appreciate.
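A minimal sketch of that provenance idea (field names are illustrative, not Kubit's schema): each generated insight carries the exact query behind it.

```python
from dataclasses import dataclass

@dataclass
class Insight:
    """An AI-generated insight that carries its own receipt: the exact SQL
    behind the narrative, so any claim can be re-run and verified."""
    summary: str
    sql: str

insight = Insight(
    summary="Weekly signups fell 18% in the two weeks after the pricing change.",
    sql=(
        "SELECT DATE_TRUNC('week', created_at) AS wk, COUNT(*) AS signups "
        "FROM events.signups GROUP BY wk ORDER BY wk"
    ),
)

# An auditor or data engineer re-runs insight.sql in the warehouse to check
# the numbers behind insight.summary.
```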
Serko, a global travel tech provider, is already using Kubit with Snowflake to power product analytics for its Booking.com for Business platform.
Before Kubit, accessing insights reportedly took weeks. Now, product teams can self-serve analytics directly from governed warehouse data—without disrupting their Snowflake-first architecture.
That’s a telling example. The real value here isn’t just faster dashboards; it’s shifting analytics from a centralized bottleneck to a distributed capability across teams.
Kubit’s move lands at a time when the analytics stack is undergoing a quiet but significant transformation.
Traditional tools like Tableau and Looker helped define the modern BI era—but they often depend on extracted or modeled datasets. Meanwhile, newer players are pushing “warehouse-native” as the next evolution, aligning analytics directly with cloud data platforms.
Snowflake, for its part, has been steadily positioning itself not just as a storage layer, but as a full-fledged application and AI platform. Partnerships like this one reinforce that strategy.
The implication is clear: the center of gravity is shifting toward the data warehouse itself. Tools that don’t adapt risk becoming redundant layers.
Perhaps the most compelling aspect of Kubit’s approach is how it blends governance with AI.
Many organizations are racing to make data “AI-ready,” but struggle with trust, consistency, and compliance. By keeping AI execution inside Snowflake—and within existing controls—Kubit sidesteps a major barrier to enterprise adoption.
It’s a subtle but important shift. Instead of asking companies to trust new systems, Kubit extends the ones they already trust.
Kubit’s Snowflake integration isn’t flashy, but it hits on a real pain point: the fragmentation of analytics in modern data stacks.
If it delivers on its promise, the combination of warehouse-native analytics and transparent AI could help enterprises move faster without sacrificing control—a balance that’s been notoriously hard to achieve.
And in a market crowded with analytics tools, that might be the differentiator that actually sticks.
Get in touch with our MarTech Experts.
marketing 20 Mar 2026
In a bid to close one of enterprise security’s most persistent blind spots, Keeper Security has introduced KeeperDB, a vault-native database access feature designed to bring zero-trust principles directly to how teams interact with sensitive data.
Set for official debut at the RSA Conference 2026, KeeperDB extends the company’s Privileged Access Management (PAM) platform by embedding database access controls directly into its existing vault environment. The goal: eliminate risky workarounds and bring order to a notoriously fragmented part of enterprise infrastructure.
Despite years of investment in identity and access management, the way teams actually connect to databases remains surprisingly outdated.

Developers and administrators often rely on a patchwork of desktop clients, shared credentials, and VPN tunnels to access production systems. These methods may be convenient, but they come with serious trade-offs: limited visibility, inconsistent policy enforcement, and a heightened risk of credential leaks or insider misuse.
That’s a problem when databases house some of the most sensitive assets an organization owns—from customer records to financial data.
KeeperDB’s premise is simple: if privileged access is already governed inside a vault, database access should be too.
KeeperDB embeds database session management directly into the Keeper Vault, allowing users to initiate connections without ever exposing credentials.
Instead of copying passwords into external tools, users can launch sessions from within the vault itself—via either a browser-based interface or command-line access. Initial support includes widely used systems like MySQL, PostgreSQL, Oracle, and Microsoft SQL Server.
The shift may sound incremental, but it addresses a core issue: credential sprawl. By keeping secrets contained and never revealed to endpoints, Keeper reduces one of the most common attack vectors in enterprise environments.
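The "secrets never revealed to endpoints" idea can be sketched as a broker that resolves credentials inside the vault boundary and hands back only a live session handle. The vault store, record IDs, and `Session` class below are stand-ins for illustration, not Keeper's actual SDK.

```python
# Sketch of credential containment: the caller receives a session handle,
# never a secret. The vault store below is a stand-in, not Keeper's API.

_VAULT = {"prod-postgres": {"user": "app", "password": "s3cr3t"}}  # vault-side only

class Session:
    """Opaque handle exposing only safe connection metadata."""
    def __init__(self, dsn):
        self.dsn = dsn      # e.g. "app@db.internal" -- no password
        self.open = True
    def close(self):
        self.open = False

def open_db_session(record_id, host):
    """Resolve credentials inside the vault boundary; return only a handle."""
    creds = _VAULT[record_id]  # the secret never leaves this function
    # A real broker would pass creds straight into a driver here, e.g.
    # psycopg2.connect(host=host, user=creds["user"], password=creds["password"])
    return Session(dsn=f"{creds['user']}@{host}")

session = open_db_session("prod-postgres", "db.internal")
print(session.dsn)  # connection metadata only; the password stays vault-side
```

Because the password exists only inside `open_db_session`, there is nothing to copy into a desktop client, paste into a script, or leak from an endpoint, which is the sprawl the article describes.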
Where KeeperDB stands out is in how it applies zero-trust principles to database workflows.
Access is governed through centralized policies, with granular controls such as read-only permissions and regulated data transfers. Every session is fully recorded, creating an audit trail that security teams can review for compliance or incident response.
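In its simplest form, a granular control like read-only access reduces to gating each statement against a policy and appending every attempt to an audit trail. This is a deliberately naive toy check for illustration; real enforcement like KeeperDB's happens at the protocol level, not by string inspection.

```python
# Toy policy gate: permit only reads for read-only roles, audit every attempt.
# Statement inspection here is deliberately naive and illustrative only.

audit_log = []  # append-only trail for compliance or incident review

def enforce(role, sql):
    """Return True if the role may run this statement; log the decision."""
    is_read = sql.lstrip().upper().startswith(("SELECT", "SHOW", "EXPLAIN"))
    allowed = is_read or role != "read_only"
    audit_log.append({"role": role, "sql": sql, "allowed": allowed})
    return allowed

print(enforce("read_only", "SELECT * FROM customers"))   # permitted
print(enforce("read_only", "DELETE FROM customers"))     # blocked
print(enforce("admin", "DELETE FROM customers"))         # permitted
```

Note that denied attempts are logged too: the audit trail records what was tried, not just what succeeded, which is what makes session recording useful for incident response.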
In effect, Keeper is treating database access the same way modern systems treat privileged infrastructure access—with strict verification, minimal exposure, and full visibility.
That’s increasingly critical as organizations face rising pressure to demonstrate compliance and prevent data exfiltration in real time, not after the fact.
Security tools often fail when they disrupt how teams actually work. KeeperDB tries to avoid that trap.
The platform offers a modern, browser-based interface for direct access, but also introduces KeeperDB Proxy for organizations that want to keep using existing database clients. The proxy routes connections through Keeper’s control layer, enforcing policies without forcing teams to abandon familiar tools.
It’s a pragmatic approach—one that acknowledges that security adoption depends as much on usability as it does on technical rigor.
KeeperDB enters a space where database access is typically handled by standalone tools or bolted onto broader platforms. Vendors like CyberArk and HashiCorp (via Vault) have tackled adjacent problems, particularly around secrets management and privileged access.
What Keeper is doing differently is collapsing those layers into a single, vault-native experience. Instead of integrating multiple tools, it’s positioning the vault itself as the control plane for database access.
That aligns with a broader industry trend: consolidating security functions to reduce complexity and improve governance.
KeeperDB reflects a growing recognition that securing identities isn’t enough—organizations also need to secure how those identities interact with data.
By embedding database access into a zero-trust, zero-knowledge architecture, Keeper is aiming to eliminate an entire class of risks tied to exposed credentials and unmanaged sessions.
For enterprises, the payoff could be significant: fewer tools to manage, stronger compliance posture, and reduced risk of breaches tied to database access.
But adoption will hinge on execution. If KeeperDB can deliver both airtight security and a frictionless user experience, it could reshape how organizations think about database access altogether.