
News

TotalEnergies Marketing India Unveils 2026 Lubricants Roadmap, Launches New Products at National Distributor Meet

marketing 16 Feb 2026

TotalEnergies Marketing India Private Limited (TEMIPL) is sharpening its focus on India’s fast-evolving lubricants market, using its annual distributor convention to unveil a 2026 strategic roadmap and roll out new product launches across its automotive portfolio.

Held under the theme “One Vision, One Direction,” the convention brought together 200 distributors and partners from the automotive and industrial lubricants segments—an unmistakable signal that channel strength remains central to the company’s India growth strategy.

In a market defined by price sensitivity, rising two-wheeler volumes, and intensifying competition from domestic and multinational brands, TEMIPL’s message was clear: growth will be distributor-led, innovation-backed, and performance-driven.

A 2026 Roadmap Built Around Network Strength

At the core of the event was TEMIPL’s 2026 roadmap, which emphasizes:

  • Deepening trust and engagement within its distribution ecosystem

  • Strengthening operational efficiency across regional networks

  • Accelerating sustainable growth through product innovation

India remains one of the most competitive lubricants markets globally, with strong incumbents such as Castrol, Shell, Indian Oil’s Servo, and Gulf Oil vying for share in both automotive and industrial segments. For multinational players, distribution depth often determines success more than brand equity alone.

By foregrounding its distributor community, TEMIPL is reinforcing a long-standing industry truth: in India’s fragmented aftermarket, last-mile access is everything.

Viken Najarian, CEO, Lubricants Automotive India, underscored this dynamic, calling distributors the backbone of success in India’s fast-moving lubricants landscape—critical to ensuring regional reach and responsiveness to shifting customer demand.

New Product Push Targets High-Volume Segments

A major highlight of the convention was the launch of three new lubricants:

  • TotalEnergies Hi-Perf Royal Cruiser 15W-50

  • Hi-Perf Scooter 5W-30

  • ELF Moto 4 Scooter 5W-30

The additions target high-volume two-wheeler and scooter segments—categories that remain central to India’s mobility ecosystem, particularly in Tier 2 and Tier 3 cities.

The company also introduced revamped packaging across its TotalEnergies and ELF product ranges, a move likely aimed at strengthening shelf visibility and brand recall in crowded retail environments.

Packaging updates, while often overlooked, can be strategically significant in India’s lubricants market, where differentiation at the point of sale plays a critical role in influencing mechanic and retailer recommendations.

India as a Strategic Growth Engine

Vincent Minard, Director of Automotive Lubricant APME at TotalEnergies, described India as one of the company’s most exciting growth markets.

That framing aligns with broader industry trends. India’s expanding vehicle parc, growing middle class, and increasing focus on vehicle maintenance are sustaining lubricants demand—even as electric vehicle adoption gradually reshapes long-term consumption patterns.

While EVs may alter lubricant demand over time, internal combustion engines—particularly in two-wheelers and commercial vehicles—will continue to dominate India’s roads in the medium term. For lubricant manufacturers, that creates a window for consolidation and premiumization.

TEMIPL’s roadmap appears designed to capitalize on that window by combining product innovation with tighter channel execution.

Rewarding Performance, Reinforcing Loyalty

The convention also included recognition of top-performing distributors, with premium rewards presented to standout partners.

Such incentives serve a dual purpose. They reinforce loyalty in a competitive channel landscape and signal that performance metrics—sales growth, market penetration, operational compliance—will be closely aligned with future strategic ambitions.

In an environment where distributors often carry multiple brands, engagement and incentive alignment can materially influence market share outcomes.

The Bigger Picture: Competing in a Crowded Market

India’s lubricants market is projected to remain one of the largest globally, driven by commercial transport growth, expanding infrastructure activity, and sustained two-wheeler demand.

However, competition is intensifying:

  • Domestic refiners are strengthening retail footprints.

  • Global majors are pushing premium synthetic offerings.

  • Price fluctuations in base oils continue to impact margins.

Against this backdrop, TEMIPL’s emphasis on innovation, operational excellence, and ecosystem empowerment reflects a pragmatic growth strategy rather than a flashy reinvention.

The 2026 roadmap signals that the company is betting on disciplined execution—strengthening relationships, refreshing product lines, and optimizing distribution—to secure sustainable gains.

 

If successfully implemented, the strategy could reinforce TotalEnergies’ competitive position in one of the world’s most strategically important lubricants markets.

Get in touch with our MarTech Experts.

NetActuate Expands Mumbai Cloud With ONE IaaS to Power Hybrid, AI-Ready Infrastructure

artificial intelligence 16 Feb 2026

NetActuate has expanded its Mumbai cloud platform, enhancing Public Cloud, Private Cloud, Virtual Private Cloud (VPC), and Hybrid Cloud services through its Open Network Edge (ONE) Infrastructure-as-a-Service platform. The move strengthens the company’s footprint in one of Asia’s fastest-growing digital and connectivity hubs—and signals rising enterprise demand for flexible, compliance-ready infrastructure in India.

For enterprises navigating AI workloads, regulatory scrutiny, and performance-sensitive applications, location matters. And in India, Mumbai remains the center of gravity.

What’s New: ONE-Powered Cloud in Mumbai

At the heart of the expansion is NetActuate’s Open Network Edge (ONE) IaaS platform—an open-source-based infrastructure stack designed to give customers standardized deployment capabilities across global locations.

The Mumbai Data Center now offers:

  • Expanded Public Cloud capacity

  • Dedicated Private Cloud environments

  • Virtual Private Cloud (VPC) configurations

  • Hybrid Cloud architectures for mixed workloads

The infrastructure is managed through NetActuate’s customer portal and includes out-of-the-box implementations of widely deployed operating systems, monitoring systems, and orchestration tools.

In practical terms, this means enterprises can architect and scale cloud environments faster while maintaining consistency across regions—a priority for companies operating multi-location deployments.
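What “standardized deployment across regions” means in practice can be sketched as a single template parameterized per location. This is a toy illustration only; the field names and function are hypothetical, not NetActuate’s actual portal API:

```python
# Hypothetical sketch: one deployment template, rendered per region, so the
# environment-defining settings stay identical everywhere it is deployed.
def render_deployment(region: str, flavor: str = "public") -> dict:
    base = {
        "os_image": "ubuntu-24.04",     # out-of-the-box OS image
        "monitoring": True,             # standard monitoring stack
        "orchestration": "standard-stack",
    }
    return {**base, "region": region, "cloud_flavor": flavor}

mumbai = render_deployment("mumbai", flavor="hybrid")
chennai = render_deployment("chennai", flavor="hybrid")

# Everything except placement matches across the two regions.
shared = {k: v for k, v in mumbai.items() if k != "region"}
```

The point of the sketch is the consistency guarantee: because both regions render from the same base, multi-location operators audit one template instead of N hand-built environments.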

NetActuate also maintains a data center presence in Chennai, enabling route diversity and improved regional reach across India.

Why Mumbai Is Strategic

Mumbai isn’t just another metro expansion. It’s India’s primary international digital gateway.

The city hosts high-capacity subsea cable landings and several announced cable routes designed to expand global bandwidth and redundancy. For enterprises delivering distributed applications, streaming services, fintech platforms, or SaaS products, proximity to these interconnection points reduces latency and improves resilience.

As internet adoption grows and India’s startup ecosystem scales globally, demand for predictable performance and low-latency user experiences is climbing.

Mark Mahle, CEO of NetActuate, framed the upgrade as a response to more than a decade of growth in India, positioning the enhanced platform as a flexible mix-and-match environment where enterprises can balance performance, cost control, and data residency.

Hybrid and Data Residency: The Real Drivers

Cloud expansion announcements are common. What’s more telling is the emphasis on hybrid and private infrastructure in this rollout.

India’s regulatory environment increasingly emphasizes local data handling and compliance. Sectors such as fintech, healthcare, and government services must meet strict governance and residency requirements.

By offering Private Cloud and VPC options in Mumbai, NetActuate is targeting enterprises that want:

  • Greater control over data placement

  • Predictable performance for regulated workloads

  • Flexibility to integrate with global infrastructure

Hybrid cloud designs are especially relevant. While hyperscalers dominate public cloud, many enterprises are adopting blended models—combining public elasticity with private control for sensitive data.

NetActuate’s positioning suggests it aims to complement, rather than compete head-on with, hyperscale providers by offering edge-proximate infrastructure with operational control.

AI and Data-Intensive Workloads on the Rise

The timing of the upgrade also aligns with increasing demand for AI-enabled and analytics-heavy applications.

AI workloads often require scalable compute, efficient data routing, and proximity to end users or data sources. Deploying infrastructure near key interconnection hubs like Mumbai can reduce inference latency and improve distributed model performance.

With ONE capabilities now active in Mumbai, NetActuate is signaling readiness to support:

  • Modern application delivery pipelines

  • Analytics and data processing platforms

  • AI-driven services requiring flexible scaling

As AI adoption expands beyond experimentation into production environments, infrastructure providers must support both burst compute demand and compliance constraints.

Competing in India’s Infrastructure Boom

India’s data center and cloud market is experiencing significant growth, driven by digital payments, OTT platforms, enterprise SaaS, and government digitization initiatives.

Global cloud giants such as AWS, Microsoft Azure, and Google Cloud continue expanding their Indian footprints. Meanwhile, regional and edge-focused providers are carving out space by emphasizing:

  • Network route diversity

  • Customizable hybrid architectures

  • Cost transparency and operational control

NetActuate’s Mumbai enhancement fits squarely into this latter strategy—offering extensible infrastructure built on open standards and tailored to distributed deployments.

The company’s presence in both Mumbai and Chennai also strengthens redundancy, a critical requirement for enterprises seeking high-availability deployments.

The Bigger Picture

Cloud conversations in India are evolving from “move to cloud” to “optimize for resilience and compliance.”

Enterprises are less concerned with pure migration and more focused on:

  • Latency optimization

  • Data governance

  • Cost-performance balance

  • AI-readiness

By expanding its ONE IaaS platform in Mumbai, NetActuate is aligning with that shift—providing flexible infrastructure that supports hybrid designs while anchoring workloads close to one of the region’s most strategic connectivity nodes.

 

For enterprises scaling across India and beyond, the message is straightforward: infrastructure optionality is no longer a luxury. It’s table stakes.

Pinterest Hits $4.2B in 2025 Revenue as AI Search Fuels Global User Growth

artificial intelligence 16 Feb 2026

Pinterest, Inc. closed out 2025 with record revenue and a clear message to advertisers: commercial intent still matters.

The social discovery platform reported $4.22 billion in full-year revenue, up 16% year over year, alongside 619 million global monthly active users (MAUs)—a 12% jump from 2024. Q4 revenue reached $1.32 billion, up 14% year over year. Adjusted EBITDA rose 23% for the year to $1.27 billion, while free cash flow climbed 33% to $1.25 billion.

In an ad market that’s been anything but predictable, Pinterest’s results point to a company gaining efficiency—and sharpening its pitch around AI-powered discovery and high-intent shopping behavior.

AI Search Is Driving Engagement—and Ad Dollars

CEO Bill Ready highlighted more than 80 billion monthly searches on the platform, crediting ongoing investments in AI-driven visual search and recommendation systems. Pinterest has been positioning itself less as a social network and more as a “visual discovery engine”—a distinction that’s increasingly important as generative AI reshapes search and commerce.

Unlike passive scrolling platforms, Pinterest’s use case often begins with planning: outfits, home renovations, travel, events. That intent translates into higher-value ad inventory, particularly as retailers and performance marketers push for measurable ROI.

In a digital ad ecosystem dominated by players like Meta Platforms and Google, Pinterest’s edge isn’t scale—it’s context. Users arrive with purpose. The company’s challenge has been monetizing that intent at scale, particularly outside North America.

The 2025 numbers suggest progress.

International Growth Is the Real Story

While U.S. and Canada revenue grew 10% year over year to $3.17 billion, the breakout gains came overseas.

  • Europe revenue surged 31% to $775 million.

  • Rest of World revenue jumped 62% to $274 million.

ARPU trends reinforce the shift. Global ARPU for 2025 reached $7.21 (up 4%), but Europe climbed 21% to $5.12, and Rest of World rose 40% to $0.83.

Those figures still trail U.S. and Canada ARPU, which hit $30.84 for the year, but the gap represents opportunity. Pinterest’s monetization runway outside North America remains substantial—particularly as it builds out localized ad sales and measurement tools.

By comparison, many social and commerce platforms are wrestling with saturated domestic ad markets and regulatory friction in Europe. Pinterest appears to be threading the needle: expanding internationally while improving monetization efficiency.

Profitability Looks Healthier—With a Caveat

On a GAAP basis, net income dropped sharply year over year—to $417 million in 2025 from $1.86 billion in 2024. That decline reflects prior-year accounting impacts rather than core operating weakness.

Non-GAAP net income rose 22% to $1.10 billion, and adjusted EBITDA margins improved to 30% for the year (up from 28% in 2024). Q4 adjusted EBITDA margin held steady at 41%.

Cash flow trends were particularly strong:

  • Operating cash flow: $1.28 billion (+33%)

  • Free cash flow: $1.25 billion (+33%)

For B2B marketers and MarTech vendors, cash flow stability matters. It signals Pinterest has room to continue investing in AI tooling, ad products, and international expansion without sacrificing profitability.

619 Million MAUs—and Still Climbing

Global MAUs reached 619 million, up from 553 million in 2024.

Regional breakdown:

  • U.S. & Canada: 105 million (+4%)

  • Europe: 158 million (+9%)

  • Rest of World: 356 million (+16%)

The fastest user growth is happening in emerging markets, reinforcing the revenue upside internationally. The company’s ability to convert that audience into ad dollars—without undermining user experience—will determine whether ARPU acceleration continues in 2026.
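The reported figures hang together arithmetically; a quick cross-check of the regional splits against the global totals (all numbers taken from this article):

```python
# All figures below come from the article: revenue in $B, MAUs in millions.
total_revenue_b = 4.22
regional_revenue_b = {"US & Canada": 3.17, "Europe": 0.775, "Rest of World": 0.274}

mau_2025_m = {"US & Canada": 105, "Europe": 158, "Rest of World": 356}
mau_2024_m = 553

# Regional revenues should sum to the global total (within rounding).
assert abs(sum(regional_revenue_b.values()) - total_revenue_b) < 0.01

# Regional MAUs should sum to the 619M global figure.
global_mau = sum(mau_2025_m.values())
assert global_mau == 619

# And 619M over 553M is the ~12% year-over-year user growth the article cites.
growth_pct = round((global_mau / mau_2024_m - 1) * 100)
```

Note that dividing full-year revenue by year-end MAUs gives roughly $6.82, below the reported $7.21 global ARPU; ARPU is generally computed per quarter on average users rather than the year-end count, so a naive annual division will not reproduce it.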

Engagement metrics also remain strong. With 80+ billion monthly searches, Pinterest is increasingly functioning as a hybrid of search engine and commerce platform—a space that’s heating up as generative AI reshapes traditional search paradigms.

Q1 2026 Outlook: Solid, Not Spectacular

For Q1 2026, Pinterest expects revenue between $951 million and $971 million, representing 11%–14% year-over-year growth. The company projects a roughly three-point foreign exchange tailwind.

Adjusted EBITDA is forecast between $166 million and $186 million.

The guidance suggests steady momentum but not acceleration—consistent with a broader digital ad market that remains cautious amid macroeconomic volatility and shifting performance marketing budgets.

The Bigger Picture: Pinterest’s Monetization Pivot

Pinterest’s 2025 performance underscores a strategic shift underway.

The company has been restructuring its sales and go-to-market approach to better align monetization with commercial intent. In practical terms, that means:

  • Smarter AI-driven ad placement

  • Stronger performance measurement tools

  • Closer integration with retailers and commerce partners

  • International sales expansion

In a landscape where ad dollars increasingly flow toward measurable outcomes, Pinterest’s pitch is simple: users come to plan, not just to scroll.

If it can continue translating visual search engagement into performance-driven ad revenue—especially overseas—Pinterest could carve out a durable position between social discovery and commerce enablement.

For marketers, that makes it more than a lifestyle platform. It’s becoming infrastructure for intent-driven advertising.

Key Takeaways for MarTech Leaders

  • AI-powered visual search is becoming central to Pinterest’s differentiation.

  • International monetization is accelerating, with Europe and Rest of World driving outsized growth.

  • Cash flow strength supports continued investment in AI and ad tooling.

  • ARPU expansion outside North America remains a major upside lever.

  • The platform’s positioning around “commercial intent” aligns with performance marketing trends.

 

Pinterest may not command the scale of the largest ad platforms—but its mix of intent, AI, and international growth suggests it’s evolving into a more formidable player in the MarTech stack.

Provenir Unveils Agentic AI Decision Intelligence Platform to Streamline Risk, Credit, and Compliance

artificial intelligence 16 Feb 2026

In the race to operationalize AI inside financial services, point tools are starting to look dated. Today, Provenir is betting that consolidation—not more fragmentation—is what banks and lenders need.

The company has launched a revamped Decision Intelligence platform that brings together data ingestion, machine learning models, decision orchestration, optimization, and now agentic AI capabilities into a single, continuous system. The goal: help financial institutions turn raw customer data into real-time, explainable decisions without bouncing between disconnected systems.

It’s a bold pitch in a market where AI decisioning has moved from “nice-to-have differentiator” to operational necessity.

What’s Actually New

At the heart of the announcement is a more tightly integrated platform architecture—and the addition of agentic AI features designed to actively assist users rather than simply surface analytics.

1. Unified Decision Intelligence Core

Provenir’s platform combines:

  • Data ingestion and enrichment

  • Machine learning model management

  • Real-time and batch decisioning

  • Continuous optimization and feedback loops

Instead of separating analytics, decision engines, and monitoring tools, Provenir claims its system executes decisions, measures outcomes, learns from results, and recommends improvements—all within one environment.

That closed-loop design is increasingly important in regulated sectors like lending, where speed must coexist with auditability.
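As a rough illustration of what a closed, auditable decision loop involves, consider a toy scorecard where every decision is logged with its reason codes. This is a minimal sketch under stated assumptions, not Provenir’s actual models or API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Decision:
    applicant_id: str
    score: float
    approved: bool
    reasons: list[str]  # reason codes for explainability and audit
    decided_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

AUDIT_LOG: list[Decision] = []  # in practice: an append-only store for regulators

def score_applicant(features: dict) -> tuple[float, list[str]]:
    """Toy scorecard standing in for a managed ML model."""
    score, reasons = 0.0, []
    if features.get("income", 0) >= 50_000:
        score += 0.5
    else:
        reasons.append("income below threshold")
    if features.get("delinquencies", 0) == 0:
        score += 0.5
    else:
        reasons.append("prior delinquencies")
    return score, reasons

def decide(applicant_id: str, features: dict, cutoff: float = 0.75) -> Decision:
    score, reasons = score_applicant(features)
    decision = Decision(applicant_id, score, score >= cutoff, reasons)
    AUDIT_LOG.append(decision)  # every decision is recorded before it is returned
    return decision

d = decide("A-001", {"income": 80_000, "delinquencies": 0})
```

The loop closes when logged outcomes feed back into model retraining and cutoff tuning; the key structural point is that scoring, deciding, and recording happen in one place rather than across disconnected systems.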

2. Agentic AI Assistant

A newly embedded AI assistant introduces natural language access to platform capabilities. Users can:

  • Query datasets conversationally

  • Understand decision logic and outputs

  • Automate tasks such as document review

  • Interact with workflows without deep technical expertise

This mirrors a broader shift across enterprise software, where AI copilots are becoming standard in everything from CRM to cloud management platforms. The difference here is domain specificity: Provenir is embedding agentic AI directly into risk and credit decisioning workflows.

3. Advanced Model Management and Simulation

Provenir is also emphasizing improved model governance and testing capabilities. Users can:

  • Monitor and compare machine learning model performance

  • Run simulations to test strategy shifts

  • Reduce testing cycles from months to weeks—or even days

In volatile economic conditions, that kind of agility matters. Institutions can quickly test policy changes against shifting credit risk environments or regulatory updates before deploying them live.
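A minimal sense of what such strategy simulation means: replay a historical book through a champion policy and a challenger policy offline, and compare outcomes before anything goes live. Purely illustrative, with synthetic data:

```python
import random

random.seed(7)  # synthetic, reproducible stand-in for a historical application book
applications = [{"score": random.random()} for _ in range(1_000)]

def approval_rate(apps: list[dict], cutoff: float) -> float:
    """Share of applications a given cutoff policy would approve."""
    return sum(a["score"] >= cutoff for a in apps) / len(apps)

champion = approval_rate(applications, cutoff=0.75)    # current live policy
challenger = approval_rate(applications, cutoff=0.70)  # proposed relaxation

# Lowering the cutoff approves more applicants; the simulation quantifies how
# many, so the trade-off against expected losses can be reviewed before deploy.
```

Running policy variants against the same replayed data is what compresses testing cycles: the comparison is computed in minutes instead of waiting months for live results.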

AI With Guardrails: Human-in-the-Loop Design

Financial services is not Silicon Valley’s playground; it’s heavily regulated terrain. Provenir is positioning its “human-in-the-loop” framework as a key differentiator.

The platform offers:

  • Transparency into how AI models generate decisions

  • Explainability tools for audit and compliance

  • Governance controls aligned with regulatory standards

With growing scrutiny around AI accountability in lending and underwriting, explainability isn’t optional—it’s existential.

LLM Integration: OpenAI, Anthropic, and Private AI via AWS

Provenir is also expanding its Global Data Marketplace into what it describes as a unified hub for both data and AI.

The company now integrates leading public and private large language models, including:

  • OpenAI

  • Anthropic

Customers can access these models through pre-integrated APIs or deploy private instances hosted via Amazon Web Services Bedrock for sensitive workloads.

This hybrid AI strategy reflects a growing enterprise trend: organizations want cutting-edge LLM capabilities but without sacrificing data residency, compliance, or control.

By embedding LLM access directly into decision workflows, Provenir is positioning itself as a governed gateway rather than a generic AI layer.

Why This Matters Now

The timing isn’t accidental.

Financial institutions are facing:

  • Rising customer expectations for personalization

  • Increased M&A activity

  • Economic uncertainty affecting credit risk

  • Regulatory tightening around AI transparency

  • Pressure to modernize legacy risk systems

Traditional decisioning stacks often involve siloed data lakes, separate model environments, disconnected rule engines, and patchwork compliance tools. That fragmentation slows innovation and complicates governance.

Provenir’s platform approach aims to collapse those silos into a single operational layer for decision intelligence.

If it works as advertised, the benefits could include:

  • Faster deployment of new lending products

  • Improved risk/reward optimization

  • Reduced operational overhead

  • More consistent decision logic across channels

  • Better alignment between business goals and AI outcomes

Competing in a Crowded AI Decisioning Market

Provenir operates in a competitive landscape that includes credit bureau decisioning platforms, fintech orchestration engines, and enterprise AI vendors pushing into financial services.

What differentiates Provenir’s announcement is its emphasis on:

  • End-to-end orchestration

  • Continuous learning loops

  • Embedded LLM integration

  • Human oversight built into the system

Rather than positioning AI as an overlay, Provenir is pitching AI as infrastructure.

That distinction could resonate with mid-sized lenders and large financial institutions looking to modernize without assembling multi-vendor AI stacks.

Use Cases Across Risk and Credit

The platform supports:

  • Real-time underwriting

  • Fraud detection

  • Customer onboarding

  • Credit line management

  • Portfolio monitoring

  • Regulatory reporting

It scales from smaller lenders to large multinational banks, handling both real-time and batch processing environments.

For institutions operating across jurisdictions, the ability to localize decision logic while maintaining centralized governance may prove particularly valuable.

The Bigger Picture: Decision Intelligence as Strategy

“Decision intelligence” is increasingly becoming its own category, sitting at the intersection of AI, analytics, and business strategy.

Instead of focusing solely on predictive models, organizations are now asking:

  • How do we connect decisions to measurable outcomes?

  • How do we adapt policies in near real time?

  • How do we ensure AI decisions are compliant and explainable?

Provenir’s unified platform strategy speaks directly to those concerns.

If AI adoption in financial services is moving from experimentation to operationalization, then infrastructure-level solutions—rather than isolated AI features—are likely to define the next phase.

What to Watch

The key questions for Provenir going forward:

  • How seamlessly can institutions migrate from legacy systems?

  • Will customers adopt public LLM integrations or default to private AI deployments?

  • Can Provenir maintain performance and compliance as regulatory frameworks evolve?

The agentic AI layer adds appeal, but execution will determine whether this is incremental innovation or meaningful transformation.

What’s clear is that AI-powered decisioning is no longer optional. Institutions that can’t adapt risk being outpaced by competitors who can move faster, personalize smarter, and manage risk more precisely.

 

Provenir is betting its unified Decision Intelligence platform is the engine that makes that shift possible.

Coreline Soft and INFINITT Roll Out “Zero-Click” AI for U.S. Radiology—No Logins, No Workflow Disruptions

artificial intelligence 16 Feb 2026

The medical AI market has matured. It’s no longer enough to boast benchmark-beating algorithms. Hospitals and imaging centers now want something more practical: AI that works inside existing systems without slowing clinicians down.

That’s the premise behind a new partnership between Coreline Soft and INFINITT North America, which has delivered a fully automated, “zero-click” AI reading solution now live in U.S. radiology practices.

Instead of asking radiologists to jump between dashboards or sign into yet another platform, the companies have embedded Coreline’s AVIEW AI directly into the INFINITT PACS environment. The result: AI-powered insights appear natively within the radiologist’s existing workflow—no extra logins, no manual triggers, no workflow detours.

In a field where seconds matter and cognitive load is high, that detail is more than cosmetic. It’s strategic.

From Algorithm Accuracy to Operational Infrastructure

The global medical AI conversation has shifted over the past five years. Early entrants focused heavily on detection rates and sensitivity metrics. But clinical adoption has lagged when tools disrupted established reading patterns.

Radiologists don’t want another screen. They want smarter readings inside the one they already use.

By deeply embedding Coreline’s AVIEW platform into INFINITT PACS, the integration transforms AI from an optional add-on into background infrastructure. The system automatically:

  • Performs coronary artery calcium (CAC) scoring

  • Enhances lung nodule detection

  • Reduces overall case reading times

For high-volume radiology groups processing more than 6,000 cases per quarter, these efficiencies compound quickly.

A “Zero-Click” Model in Practice

The first U.S. site to adopt the solution, ImageCare Radiology, reported a rapid rollout across its network—completed within one month. According to its leadership, the impact was immediate in both speed and diagnostic accuracy.

The phrase “zero-click” isn’t marketing fluff here. It signals something critical: the AI activates without user intervention.

That matters because adoption in radiology often fails at the friction point. Even small workflow disruptions can stall usage. By eliminating manual triggers and secondary interfaces, the system keeps the radiologist’s attention on interpretation rather than navigation.

This approach aligns with broader industry trends. Enterprise AI vendors in healthcare are increasingly pursuing deep PACS and EHR integrations rather than standalone AI dashboards. The goal is invisible intelligence, not visible complexity.
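Architecturally, the “zero-click” idea reduces to event-driven triggering: the AI subscribes to study arrival and runs with no user action. A hypothetical sketch; none of these names are the vendors’ real APIs:

```python
from typing import Callable

# Handlers registered here run automatically on every new study.
subscribers: list[Callable[[dict], None]] = []

def on_study_arrival(handler: Callable[[dict], None]):
    """Decorator: register an AI module as a zero-click subscriber."""
    subscribers.append(handler)
    return handler

results: list[dict] = []

@on_study_arrival
def run_cac_scoring(study: dict) -> None:
    # Stand-in for the AI module; writes its output back to the study record
    # so it appears natively in the reading environment.
    results.append({"study_id": study["id"], "finding": "CAC score computed"})

def ingest_study(study: dict) -> None:
    """Called by the PACS when a new study lands; fans out to AI handlers."""
    for handler in subscribers:
        handler(study)

ingest_study({"id": "CT-1001"})  # the radiologist never clicks anything
```

The design choice is that the trigger lives in the ingest path, not in the user interface, which is why adoption does not depend on clinicians changing behavior.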

Revenue Impact: From Detection to Follow-Up

Beyond clinical metrics, the integration carries financial implications.

Improved detection sensitivity and automated CAC scoring can increase downstream follow-up exams. In value-based care environments and screening programs, that translates directly into measurable ROI.

For imaging networks handling thousands of cases quarterly, even modest increases in follow-up rates can create meaningful revenue uplift. AI that identifies more actionable findings doesn’t just improve care—it expands billable opportunities.

That dual impact—clinical performance plus financial return—is becoming a decisive factor in AI procurement decisions.
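A back-of-envelope sketch of that uplift: only the 6,000 cases per quarter figure comes from the article; the follow-up uplift rate and per-exam reimbursement are hypothetical placeholders chosen for illustration.

```python
cases_per_quarter = 6_000        # high-volume group size cited in the article
followup_uplift = 0.02           # HYPOTHETICAL: 2% more actionable findings
reimbursement_per_exam = 300.0   # HYPOTHETICAL: avg follow-up exam revenue, USD

extra_followups = round(cases_per_quarter * followup_uplift)
quarterly_uplift = extra_followups * reimbursement_per_exam
# 6,000 cases * 2% = 120 extra follow-ups; 120 * $300 = $36,000 per quarter
```

Even with conservative placeholder inputs, the math shows why per-case efficiencies compound at volume, which is the point the procurement argument turns on.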

The INFINITT Integration Suite: Built for U.S. Reimbursement

To accelerate adoption in the U.S., Coreline Soft has launched the INFINITT Integration Suite, a dedicated package tailored specifically for INFINITT users.

The suite enables:

  • Rapid deployment of AI modules

  • Support for CPT-based reimbursement

  • Fully automated, hands-free workflows

Reimbursement alignment is particularly important in the U.S. healthcare system, where financial viability often dictates technology adoption. By incorporating CPT code support, the companies are addressing a common bottleneck in AI commercialization.

Instead of forcing radiology groups to navigate reimbursement uncertainty, the integration is designed to plug directly into established billing structures.

A Competitive Thoracic AI Portfolio

Coreline Soft isn’t stopping at workflow integration. The company is positioning its broader thoracic AI portfolio for visibility at the 2026 annual meeting of the Society of Thoracic Radiology.

The lineup includes:

  • AVIEW LCS, positioned as a “First Reader” solution for lung cancer screening

  • AVIEW Lung Texture for interstitial lung disease (ILD)

  • AVIEW COPD for chronic obstructive pulmonary disease assessment

The portfolio also emphasizes Opportunistic Screening—extracting multiple clinical insights from a single low-dose CT scan. That means simultaneously evaluating:

  • Lung cancer risk

  • COPD indicators

  • Cardiovascular risk markers

This multi-condition analysis from one imaging study reflects a growing industry push toward comprehensive, AI-enhanced diagnostic value from existing scans.


Why This Matters in 2026

Radiology is under mounting pressure. Workforce shortages persist, imaging volumes continue to rise, and demand for screening programs—particularly lung cancer screening—is expanding.

At the same time, AI vendors are proliferating.

What differentiates this deployment is not simply detection capability but operational embedment. In an increasingly crowded AI imaging market, seamless integration may matter more than marginal gains in sensitivity.

Healthcare IT buyers are asking new questions:

  • Does the AI fit into existing PACS systems?

  • Can it scale across multi-site networks quickly?

  • Does it support reimbursement models?

  • Will clinicians actually use it?

By eliminating friction, Coreline Soft and INFINITT are attempting to answer all four at once.


The Bigger Trend: Invisible AI Wins

The most successful AI in healthcare may be the least noticeable.

As the market evolves from pilot programs to production environments, solutions that operate quietly inside established systems—rather than demanding attention—are likely to win.

The “zero-click” approach signals a shift in medical AI maturity. Instead of asking clinicians to adapt to software, vendors are adapting software to clinicians.

If this model scales beyond early adopters, it could set a new baseline for how imaging AI is delivered in U.S. radiology practices.


And in a field where workflow efficiency and diagnostic precision are both mission-critical, invisible intelligence may be the smartest innovation of all.

Get in touch with our MarTech Experts.

BMC and AWS Ink 5-Year Deal to Supercharge Control-M SaaS with GenAI and Unified Orchestration

marketing 13 Feb 2026

BMC is doubling down on cloud—and on Amazon. The enterprise software veteran has signed a five-year strategic collaboration agreement (SCA) with Amazon Web Services to expand how enterprises orchestrate application workflows and data pipelines at scale.

At the center of the deal is BMC’s Control-M SaaS platform, which will now run on AWS as its preferred cloud provider. The move formalizes a long-standing relationship and positions Control-M as a cloud-native orchestration layer for hybrid, data, and AI-driven enterprises.

For CIOs and data leaders grappling with sprawling multi-cloud environments and AI initiatives, the message is clear: BMC wants to be the automation backbone that connects it all—now with deeper AWS integration and generative AI baked in.

What’s New: GenAI and Agentic AI Meet Enterprise Orchestration

The headline innovation isn’t just infrastructure. It’s intelligence.

Through the partnership, BMC is embedding its generative AI capabilities—most notably Jett, its AI-powered advisor—into Control-M running on AWS. Jett delivers intelligent guidance, automated insights, and context-aware recommendations designed to help teams modernize faster across AWS environments.

BMC is also pushing into agentic AI, positioning Control-M as more than a job scheduler. The company aims to deliver native AWS-based automation that can autonomously manage data pipelines and workflows across hybrid and cloud systems.

For organizations building machine learning pipelines in services like Amazon SageMaker, the integration promises out-of-the-box orchestration that ties together data prep, model training, deployment, and monitoring. That’s a practical win for enterprises trying to move AI projects from pilot to production without creating new silos.
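The dependency ordering described above can be illustrated with a minimal workflow runner. This is a hypothetical sketch, not BMC's Control-M API: the step names and the `PIPELINE` structure are invented for illustration, with Python's standard-library `graphlib` standing in for the orchestration engine.

```python
# Minimal DAG runner illustrating the dependency ordering an orchestrator
# manages across an ML pipeline. Step names are hypothetical stand-ins
# for managed-service jobs (e.g. a SageMaker training job).
from graphlib import TopologicalSorter

# Each step maps to the set of steps it depends on.
PIPELINE = {
    "data_prep": set(),
    "model_training": {"data_prep"},
    "deployment": {"model_training"},
    "monitoring": {"deployment"},
}

def run_step(name: str) -> str:
    """Stand-in for invoking the real job behind each step."""
    return f"completed:{name}"

def run_pipeline(dag: dict) -> list:
    # static_order() yields steps only after all their dependencies.
    order = list(TopologicalSorter(dag).static_order())
    return [run_step(step) for step in order]

print(run_pipeline(PIPELINE))
```

The point of a unified orchestration layer is precisely this guarantee: model training never starts before data prep finishes, regardless of which cloud or on-prem system hosts each step.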

Why It Matters: Orchestration Is the New Control Plane

As enterprises scale AI and data initiatives, orchestration has quietly become mission-critical. Modern architectures often span on-prem infrastructure, multiple clouds, containers, and managed AI services. Without a unified layer to coordinate workflows, complexity spirals fast.

That’s where Control-M fits in.

Already available in AWS Marketplace, the platform enables end-to-end orchestration of data pipelines across hybrid environments. With this expanded collaboration, BMC is betting that AWS customers want a single control plane to manage everything from legacy workloads to containerized microservices to AI jobs.

The timing aligns with broader industry shifts. Enterprises are accelerating cloud modernization while contending with governance, data residency, and compliance demands. BMC recently expanded Control-M SaaS availability to the AWS Sydney Region, alongside deployments in Ireland, Canada, and the U.S.—a nod to rising demand for localized data processing and resiliency.

Deep AWS Integration Across Services

The collaboration extends beyond hosting. BMC continues to roll out monthly integrations with AWS services, including:

  • Amazon Athena

  • Amazon Bedrock

  • Amazon Elastic Container Service (ECS)

  • AWS CloudFormation

  • AWS Mainframe Modernization

  • Amazon SageMaker

This expanding ecosystem suggests a deliberate strategy: make Control-M the connective tissue between AWS-native services and legacy enterprise systems.

For customers like Air Europa, the appeal is immediate integration with tools such as Amazon SageMaker to support data, machine learning, and AI roadmaps. The orchestration layer becomes the operational glue that ties data strategy to execution.

Competitive Context: The Automation Arms Race

BMC’s move comes amid intensifying competition in automation and workload orchestration. Vendors across the IT operations, DevOps, and data engineering spectrum are racing to embed AI into workflow management. Meanwhile, hyperscalers like AWS are expanding their own native orchestration capabilities.

By aligning closely with AWS rather than competing head-on, BMC is taking a pragmatic route. It strengthens its relevance inside the AWS ecosystem while differentiating through enterprise-grade automation depth and hybrid support—areas where many cloud-native tools still lag.

The five-year term also signals commitment. Strategic collaboration agreements with AWS typically involve joint go-to-market efforts, co-innovation, and tighter technical alignment. For BMC, that means greater visibility inside one of the world’s largest cloud marketplaces.

The Bigger Picture: Modernization Without Disruption

For enterprises, modernization isn’t just about moving to the cloud. It’s about orchestrating legacy systems, data platforms, AI workloads, and emerging services without breaking what already works.

By making AWS its preferred cloud for Control-M SaaS, BMC is betting that customers want flexibility without fragmentation. The promise: unified orchestration across hybrid, cloud, data, and AI workloads—paired with AI-driven insights to reduce operational drag.

If BMC delivers, Control-M could evolve from a trusted scheduling platform into a strategic AI-era control layer for enterprise operations.

In a market obsessed with generative AI headlines, this deal is less about flashy demos and more about plumbing—the kind that determines whether digital transformation efforts actually scale.


And in enterprise IT, the plumbing is where the real battles are won.


Black Duck Polaris Expands Native SCM Integrations to Secure AI-Generated Code at Enterprise Scale

artificial intelligence 13 Feb 2026

Application security is colliding with a new reality: thousands of repositories, globally distributed teams, and a surge of AI-generated code. Today, Black Duck is responding with a major update to its Polaris platform, rolling out enhanced, native integrations across all major source code management (SCM) systems.

The upgraded Black Duck Polaris Platform now delivers built-in integrations with GitHub, GitLab, Azure DevOps, and Bitbucket—not as bolted-on scripts, but as natively engineered connections designed for enterprise scale.

In an era when code is written by both humans and machines, Black Duck is making a clear bet: security has to move at the speed of development, or it becomes irrelevant.

What’s New: Native, Automated, and AI-Aware AppSec

Polaris has long combined static application security testing (SAST), software composition analysis (SCA), and dynamic application security testing (DAST) in a SaaS model. What’s new here is the depth of automation and orchestration across SCM environments.

The enhanced integrations introduce:

  • Instant onboarding for thousands of repositories without manual setup

  • Continuous synchronization as repos are renamed, branched, or created

  • Automated scan triggers on pull request creation, updates, and pre-merge events

  • Single-click policy enforcement across large repo estates

  • Automatic user and role synchronization

For organizations managing hundreds—or thousands—of repositories, that shift matters. Manual onboarding and piecemeal security tools often lead to blind spots. Polaris now aims to eliminate those gaps by auto-detecting changes across SCM systems and maintaining continuous coverage.

In practical terms, security scans can now trigger automatically during the pull request process, embedding vulnerability detection directly into code review workflows. Developers see findings inside the pull request itself, reducing the need to switch tools or escalate late-stage issues.

That’s DevSecOps without the “Sec” slowing things down.
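Event-based scan triggering of this kind can be sketched in a few lines. This is an illustrative stand-in, not Black Duck's implementation: the event and action names mirror common SCM webhook conventions (such as GitHub's `pull_request` actions), and the scan queue is hypothetical.

```python
# Hypothetical sketch of webhook-driven scan triggering: decide which
# SCM events should kick off a security scan on a pull request.

# (event, action) pairs that warrant a scan; names follow common
# pull-request webhook conventions, used here for illustration only.
SCAN_TRIGGERS = {
    ("pull_request", "opened"),
    ("pull_request", "synchronize"),      # new commits pushed to the PR
    ("pull_request", "ready_for_review"), # pre-merge gate
}

def should_scan(event: str, action: str) -> bool:
    return (event, action) in SCAN_TRIGGERS

def handle_webhook(payload: dict):
    """Return a scan-queue message for qualifying events, else None."""
    event = payload["event"]
    action = payload.get("action", "")
    if should_scan(event, action):
        return f"queued scan for {payload['repo']}@{payload['head_sha']}"
    return None

print(handle_webhook({
    "event": "pull_request", "action": "opened",
    "repo": "org/service", "head_sha": "abc123",
}))
```

Pushing the decision into the webhook path is what lets findings surface inside the pull request itself rather than in a separate security console.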

AI-Powered Security for AI-Generated Code

The rise of generative coding tools has fundamentally changed the attack surface. Enterprises are now grappling with code that may be syntactically correct but security-naïve—or worse, subtly flawed at scale.

Black Duck is leaning into AI to counter AI.

Through Black Duck Signal, organizations can run AI-powered scans directly in the IDE or through CI/CD pipelines, all centrally managed in Polaris. Signal is designed to surface meaningful security insights in both human- and AI-generated code, before it ever makes it into production.

Meanwhile, Code Sight extends that coverage directly into the developer’s desktop environment. It triggers Polaris scans in real time while coding, and when combined with Black Duck Assist’s AI-driven remediation guidance, offers contextual fixes instead of abstract vulnerability reports.

The goal: catch vulnerabilities before commit, not after deployment.

In a market crowded with AI security claims, the differentiator here is workflow placement. Black Duck isn’t just adding AI to dashboards—it’s embedding intelligence at the precise points where code changes happen.

Customizable Scanning for Modern Development Cadence

Another key addition is flexible scanning depth. Teams can opt for:

  • Full, deep analysis for comprehensive security checks

  • Rapid analysis for ultra-fast feedback in high-velocity workflows

This dual-mode capability reflects a broader industry trend: security must adapt to different pipeline contexts. A hotfix merge doesn’t require the same scanning depth as a major release candidate. Polaris now allows enterprises to tailor scanning to the moment, balancing speed with rigor.
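A depth-selection policy like this reduces to a simple mapping from pipeline context to scan mode. The contexts and mode names below are hypothetical, chosen to illustrate the trade-off rather than reflect Polaris configuration.

```python
# Illustrative policy: pick scan depth from the pipeline context.
# Context and mode names are invented for this sketch.

DEEP_CONTEXTS = {"release_candidate", "main_merge", "nightly"}

def select_scan_depth(context: str) -> str:
    """Full analysis for high-stakes events, rapid feedback otherwise."""
    return "full" if context in DEEP_CONTEXTS else "rapid"

print(select_scan_depth("hotfix"))             # urgent merge: fast feedback
print(select_scan_depth("release_candidate"))  # release gate: deep analysis
```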

Why It Matters: Scaling DevSecOps Without Adding Friction

Enterprise software development has become massively distributed. Teams are global. Repositories multiply quickly. AI accelerates output. But security headcount doesn’t scale linearly.

That imbalance creates risk.

Black Duck’s enhanced SCM integrations aim to solve the operational bottleneck: instead of manually onboarding projects and enforcing policies repo by repo, organizations can automate coverage across their entire SCM footprint.

The company claims that no other solution combines this breadth of SCM support with universal event- and policy-based automation and AI-powered depth of analysis.

While competitors in the AppSec space are increasingly emphasizing platform consolidation and AI assistance, Polaris positions itself as both comprehensive and workflow-native. The strategy reflects a growing realization in the industry: fragmented security tools don’t just slow teams—they create coverage gaps attackers exploit.

The Bigger Picture: AppSec in the Age of Code Explosion

Software supply chains are expanding rapidly. Microservices, third-party libraries, and AI-generated snippets have made applications more modular—and more vulnerable.

At the same time, enterprises are racing to operationalize AI, often across sprawling codebases managed in mixed SCM environments. Security leaders are under pressure to ensure policy consistency across GitHub, GitLab, Azure DevOps, and Bitbucket simultaneously.

By offering unified, automated coverage across all four major SCM platforms, Black Duck is targeting that exact pain point.

The result isn’t just tighter integration. It’s an attempt to make security ambient—always present, always synchronized, and invisible until needed.

If Polaris delivers on its promise, enterprises may finally be able to scale DevSecOps without scaling friction alongside it.


Feedzai and Neterium Unite to Deliver Real-Time, AI-Driven Financial Crime Screening

marketing 13 Feb 2026

Financial crime compliance is getting more complex—and more expensive. In response, Feedzai and Neterium have announced a strategic partnership aimed at consolidating watchlist and transaction screening into a single, AI-powered platform.

The deal embeds Neterium’s cloud-native screening technology directly into Feedzai’s RiskOps platform, expanding its Watchlist Screening solution with newly launched Transaction Screening capabilities. The result: a unified AML and sanctions screening engine built for real-time payments and modern compliance demands.

For financial institutions juggling fragmented compliance stacks and mounting regulatory scrutiny, the message is simple—fewer integrations, faster deployment, and smarter detection.

What’s New: Real-Time Transaction Screening Meets Watchlist Intelligence

Feedzai’s Watchlist Screening solution already delivers API-driven, ultra-low-latency compliance checks. With Neterium’s advanced algorithmic matching now integrated, the platform extends beyond static name checks to dynamic transaction screening in real time.

That matters in the instant payments era. As funds move in seconds, compliance checks must keep pace without slowing down the customer experience.

The upgraded platform promises:

  • Frictionless real-time processing that scales to peak transaction volumes

  • AI-driven holistic matching to reduce false positives

  • Automated global sanctions updates to maintain regulatory accuracy

  • Explainable decisioning and audit-ready reporting

  • Integrated fraud and AML insights across Feedzai’s broader suite

By embedding Neterium’s infrastructure directly into its financial crime prevention stack, Feedzai is positioning itself as a single control layer for sanctions screening, transaction monitoring, and fraud prevention.

Why It Matters: False Positives Are Draining Compliance Teams

For banks and fintechs, false positives aren’t just an annoyance—they’re a cost center. Analysts spend hours clearing alerts that pose no real threat, while true risks can slip through fragmented systems.

Neterium’s algorithmic matching is designed to cut through that noise. Smarter entity resolution and contextual screening aim to reduce unnecessary alerts while improving detection precision.
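The core mechanic of threshold-based name screening can be sketched with standard-library tools. This is purely illustrative: Python's `difflib` stands in for Neterium's proprietary matching algorithms, and the watchlist entries and threshold are invented for the example.

```python
# Illustrative fuzzy watchlist screening with a similarity threshold.
# difflib.SequenceMatcher is a stand-in for production matching logic;
# names and threshold are hypothetical.
from difflib import SequenceMatcher

WATCHLIST = ["Ivan Petrov", "Acme Trading LLC"]

def normalize(name: str) -> str:
    # Case-fold and collapse whitespace before comparison.
    return " ".join(name.lower().split())

def screen(name: str, threshold: float = 0.85) -> list:
    """Return (entry, score) pairs whose similarity meets the threshold."""
    hits = []
    for entry in WATCHLIST:
        score = SequenceMatcher(None, normalize(name), normalize(entry)).ratio()
        if score >= threshold:
            hits.append((entry, round(score, 2)))
    return hits

print(screen("IVAN  PETROV"))  # exact match after normalization
print(screen("John Smith"))    # below threshold: no alert raised
```

The threshold is where the false-positive trade-off lives: raise it and analysts clear fewer spurious alerts, lower it and fewer true risks slip past, which is why smarter matching beats simply tuning a cutoff.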

Equally important is transparency. Regulators increasingly demand explainable AI models and detailed audit trails. Feedzai says the unified platform delivers end-to-end visibility and compliance-ready reporting, addressing both operational efficiency and regulatory defensibility.

At a time when global sanctions lists evolve rapidly and regulatory bodies tighten expectations, automated, real-time data updates eliminate manual list management—a persistent pain point for compliance teams.

A Strategic Bet on Platform Consolidation

The partnership reflects a broader market shift toward consolidation in RegTech and financial crime prevention. Institutions are under pressure to simplify their tech stacks while maintaining comprehensive coverage across fraud, AML, sanctions, and transaction screening.

Rather than building from scratch, Feedzai is extending its capabilities through embedded infrastructure—folding Neterium’s cloud-native screening engine into its RiskOps architecture.

For Neterium, the deal expands reach into Feedzai’s global banking customer base. For Feedzai, it strengthens its claim as an AI-native, end-to-end financial crime platform.

The timing is notable. Instant payments, cross-border transfers, and digital banking growth have expanded both transaction volumes and exposure to sanctions risk. Regulators expect faster detection with fewer errors—an increasingly difficult balance to strike.

The Bigger Picture: Compliance at Machine Speed

Financial crime prevention is moving toward continuous, real-time decisioning powered by AI. Static batch screening models no longer suffice in an ecosystem defined by instant payments and embedded finance.

By integrating transaction screening directly into its platform, Feedzai is aiming to align compliance with transaction velocity. The promise isn’t just faster checks—it’s smarter, explainable risk decisions that reduce friction for legitimate customers.

If successful, the collaboration could signal a new baseline for compliance platforms: unified screening, integrated fraud insights, and AI-driven matching—all delivered through a single API-powered ecosystem.


For compliance leaders navigating escalating risk and shrinking operational tolerance for inefficiency, that consolidation may be more than convenient. It may be necessary.

