Contextual AI Partners with WEKA to Accelerate Next-Gen Enterprise AI Models

PR Newswire

Published on: Aug 8, 2024

WekaIO (WEKA), an AI-native data platform leader, has announced a collaboration with Contextual AI, a company revolutionizing enterprise AI applications through advanced Contextual Language Models (CLMs). Contextual AI’s CLMs, trained using the proprietary RAG 2.0 approach, now leverage the power of the WEKA® Data Platform to enhance performance and reliability for Fortune 500 enterprises.

  • Overview of Contextual AI and RAG 2.0:

    • Founded in 2023, Contextual AI specializes in building enterprise AI applications using RAG 2.0, an advanced retrieval-augmented generation approach.
    • Unlike traditional models, RAG 2.0 integrates embedding, retrieval, and generation into a single system, delivering higher accuracy, compliance, and attribution to source documents (a generic embed-retrieve-generate sketch follows this list).
  • Challenges in AI Model Development:

    • Generative AI workloads demand significant computational power and efficient data management; shortfalls in either create GPU utilization bottlenecks and prolong training times.
    • Contextual AI initially faced performance and scaling challenges that hindered its ability to train AI models effectively.
  • The Role of WEKA Data Platform:

    • Optimizing GPU Utilization:
      • The WEKA Data Platform's AI-native architecture accelerates data pipelines, keeping GPUs saturated with data so AI workloads run at peak efficiency.
      • Cloud-agnostic and hardware-independent, WEKA's platform supports diverse AI workload profiles, handling complex metadata operations and large-scale data writes.
    • Deployment on Google Cloud:
      • Contextual AI deployed WEKA on Google Cloud to manage 100 TB of datasets, significantly improving data performance and accelerating model training times.
    • Enhanced Data Management:
      • WEKA provided seamless metadata handling, checkpointing, and preprocessing, eliminating bottlenecks, improving overall GPU utilization, and lowering costs.
  • Key Outcomes Achieved:

    • 3x Performance Improvements:
      • The WEKA platform enabled a threefold increase in GPU utilization across key AI use cases.
    • 4x Faster Model Checkpointing:
      • Checkpointing processes ran four times faster, markedly improving developer productivity (see the checkpointing sketch after this list).
    • 38% Cost Reduction:
      • Cloud storage costs per terabyte were reduced by 38%, contributing to significant operational savings.
  • Statements from Leadership:

    • Amanpreet Singh, CTO & Co-founder of Contextual AI:
      • Praised the WEKA platform for turning fast, ephemeral storage into persistent, affordable data storage essential for scaling AI solutions.
    • Jonathan Martin, President at WEKA:
      • Highlighted WEKA’s role in helping Contextual AI overcome data management challenges, accelerating the development of trustworthy, next-gen AI models.
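
A generic, minimal sketch of the retrieve-then-generate flow referenced above, written in plain Python with toy bag-of-words retrieval, a stub generator, and hypothetical document names. It illustrates only the embed → retrieve → generate pattern and the source attribution the announcement emphasizes; it is not Contextual AI's RAG 2.0 implementation, which jointly optimizes these components as one system.

```python
# Illustrative sketch only: a minimal retrieval-augmented generation flow.
# Toy bag-of-words "embeddings", cosine-similarity retrieval, and a stub
# generator that cites its source document. Document names are hypothetical.
import math
from collections import Counter

DOCUMENTS = {
    "policy.txt": "Employees may carry over up to five unused vacation days per year.",
    "handbook.txt": "Expense reports must be filed within thirty days of travel.",
}

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, k: int = 1) -> list[tuple[str, str]]:
    """Rank documents by similarity to the query and return the top-k (name, passage) pairs."""
    q = embed(query)
    ranked = sorted(DOCUMENTS.items(), key=lambda kv: cosine(q, embed(kv[1])), reverse=True)
    return ranked[:k]

def generate(query: str) -> str:
    """Stub generator: answer from the top retrieved passage and attribute the source."""
    source, passage = retrieve(query)[0]
    return f"{passage} [source: {source}]"

if __name__ == "__main__":
    print(generate("How many vacation days can I carry over?"))
```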
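
The checkpointing figure is easier to interpret with the generic sketch below: while a synchronous checkpoint is being written, the training loop (and the GPUs feeding it) sits idle, so a faster write directly shortens that stall. The directory, model size, and intervals here are made-up placeholders; this is not WEKA's or Contextual AI's code, only an illustration of why checkpoint write speed affects utilization and developer productivity.

```python
# Illustrative sketch only: synchronous checkpointing stalls training while
# the state is written out, so faster writes mean less idle compute.
import pickle
import time
from pathlib import Path

# Hypothetical local directory; in practice this would be a shared filesystem mount.
CHECKPOINT_DIR = Path("checkpoints")

def save_checkpoint(step: int, model_state: dict) -> float:
    """Serialize model state to disk and return the write time in seconds."""
    CHECKPOINT_DIR.mkdir(parents=True, exist_ok=True)
    path = CHECKPOINT_DIR / f"step_{step:08d}.pkl"
    start = time.perf_counter()
    with path.open("wb") as f:
        pickle.dump(model_state, f)
    return time.perf_counter() - start

def train(steps: int = 1000, checkpoint_every: int = 200) -> None:
    model_state = {"weights": [0.0] * 100_000}  # stand-in for real model tensors
    stalled = 0.0
    for step in range(1, steps + 1):
        model_state["weights"][step % len(model_state["weights"])] += 0.01  # fake training step
        if step % checkpoint_every == 0:
            stalled += save_checkpoint(step, model_state)  # training pauses here until the write completes
    print(f"time spent blocked on checkpoint writes: {stalled:.2f}s")

if __name__ == "__main__":
    train()
```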

The partnership between Contextual AI and WEKA exemplifies the cutting-edge advancements driving the AI revolution. By leveraging WEKA's AI-native data platform, Contextual AI has successfully optimized GPU utilization, accelerated model training, and reduced costs, positioning itself at the forefront of enterprise AI innovation.