New Revefi Research Reveals That Bad Data Is Wreaking Havoc for Businesses

Business Wire

Published on: Oct 19, 2023

Survey shows why data teams need a trusted copilot to manage data quality, performance, usage and costs

New research from Revefi reveals that bad data costs companies significantly, and in more ways than one. Forty percent of the more than 300 IT directors, data and analytics managers, and other IT professionals surveyed said they encounter 11 to 100 data incidents per month. Sixty-five percent of the group said bad data delays processes. More than 75% said that it is somewhat, very, or extremely difficult to manage data warehouse spend, a problem that is especially acute as companies work to do more with less.

“Data quality and management issues are on the rise, and that’s a costly problem for businesses,” said Sanjay Agrawal, CEO and co-founder of Revefi. “It leaves them spending too much time manually identifying root causes of data issues, and it creates delays, wastes money, leads to poor decision-making, and reduces customer trust and the accuracy of AI models.”

“Every business wants to be data-driven, yet there’s a lack of trust in data,” added Shashank Gupta, CTO and co-founder of Revefi. “Data teams typically lack the tools that they need to understand if they have a data problem and identify root causes quickly to fix issues and move their businesses forward. I witnessed this struggle firsthand during my time on Meta’s data infrastructure team. That led Sanjay and me to start Revefi and launch Revefi Data Operations Cloud, a zero-touch platform to monitor and manage data quality, performance and costs.”

Organizations across sectors spend far too much time finding and resolving data incidents

More than a quarter of those surveyed said detecting most data incidents takes up to eight hours. A tenth of the group said identification can take days or even more than a week. Beyond the time it requires, manually identifying problems is enormously resource-intensive, as finding the root cause often requires the involvement of multiple people across teams.

That’s just uncovering the source of the problem. Then, you need to fix it. Yet 43% of survey respondents said it takes more than 48 hours to resolve a data incident after discovering it.

Half of the survey respondents from the manufacturing sector and 60% of IT professionals in education admitted that it takes them more than 48 hours to resolve data incidents. A full 100% of respondents in energy, oil & gas said they typically must dedicate more than 48 hours to resolve a situation stemming from bad data.

Bad data and other data challenges like cost management can have adverse consequences

Fifty-eight percent of the IT professionals surveyed said that data quality and cleanliness are the most significant challenges that they face when working with data. Nearly as many (57%) respondents in the survey group revealed that they have encountered inaccurate data.

The same share (57%) said bad data has led to poor decision-making. Nearly as many (56%) said they believe that bad data reduces the accuracy of AI model performance. That is concerning given how heavily AI models have come into use in recent months.

Half of the survey respondents said that managing their data warehouse spend is difficult. Bad data can also erode data users’ trust and undermine company efforts to build a data culture. Indeed, 40% of respondents said they believe bad data reduces customer trust.

Data quality is critical to AI model training and to ethical AI development

As the adoption of AI grows and more organizations rely on AI models to automate more decisions and processes, the need for high-quality data takes on even greater importance.

With that in mind, it’s troubling that 43% of respondents said they have experienced negative consequences due to poor data quality in AI projects. It’s also concerning that more than half (52%) of IT professionals only somewhat trust the data sources being used to train AI models.

But there’s also some good news here. A whopping 70% of IT professionals believe that addressing data quality issues is important from an ethical standpoint in AI development.

With a copilot, data teams can manage data quality, performance, usage and costs

The better news is that technology is now available to help data teams manage data quality, performance, usage and costs. Revefi Data Operations Cloud is a copilot that empowers data teams to get the right data in their cloud data warehouses reliably, promptly and affordably.

For example, Revefi Data Operations Cloud gives FCP Euro a granular, accurate understanding of its data platforms so it can see how teams consume data and where they might be missing certain fields or values that could add insight. That allows the e-commerce company to know what its teams need from data and to focus its energy and investment on delivering it. Revefi Data Operations Cloud also provides FCP Euro with better insight for cost management, easy-to-understand data quality alerts and a sound foundation for data governance.

Revefi Data Operations Cloud is also helping ThoughtSpot move its business forward. ThoughtSpot now enjoys data freshness; automatic, instant access to data lineage; greater user trust in data; and cost and spend management. Within weeks of adopting Revefi, ThoughtSpot reduced its cloud data platform costs by 30% while increasing its platform usage by 35%.

Another customer, Uplimit, has achieved data reliability at scale with Revefi. The AI education platform company turned to Revefi to ensure a high standard of quality for the data it uses. With Revefi, Uplimit enjoys the benefits of data quality monitoring without a dedicated resource, silent failure detection, stringent control over data access to ensure SOC 2 compliance and the ability to find unused data resources so that it can identify cost savings opportunities.