Cloudera Unveils AI Inference Service with Embedded NVIDIA NIM Microservices to Accelerate GenAI ...
Cloudera, the
only true hybrid platform for data, analytics, and AI, launched Cloudera AI
Inference powered by NVIDIA NIM microservices, part of the NVIDIA AI Enterprise platform. As one
of the industry’s first AI inference services to provide embedded NIM
microservice capability, Cloudera AI Inference uniquely streamlines the
deployment and management of large-scale AI models, allowing enterprises to
harness their data’s true potential to advance GenAI from pilot phases to full
production.
Recent data from Deloitte reveals the biggest barriers to
GenAI adoption for enterprises are compliance risks and governance concerns,
yet adoption of GenAI is progressing at a rapid pace, with over two-thirds of
organizations increasing their GenAI budgets in Q3 this year. To mitigate these
concerns, businesses must turn to running AI models and applications privately,
whether on premises or in public clouds. This shift requires secure and
scalable solutions that avoid complex, do-it-yourself approaches.
Cloudera AI Inference protects sensitive data from leaking
to non-private, vendor-hosted AI model services by keeping development
and deployment secure and under enterprise control. Powered by NVIDIA technology, the
service helps enterprises build trusted data for trusted AI at high
performance, enabling the efficient development of AI-driven chatbots, virtual
assistants, and agentic applications that drive both productivity and new business
growth.
The launch of Cloudera AI Inference comes on the heels of
the company’s
collaboration with NVIDIA, reinforcing Cloudera’s commitment to
driving enterprise AI innovation at a critical moment, as industries navigate
the complexities of digital transformation and AI integration.
Developers can build, customize, and deploy
enterprise-grade LLMs with up to 36x faster performance using NVIDIA Tensor
Core GPUs, and nearly 4x the throughput of CPUs. The
seamless user experience integrates the UI and APIs directly with NVIDIA NIM
microservice containers, eliminating the need for command-line interfaces (CLI)
and separate monitoring systems. The service's integration with Cloudera's AI Model
Registry also enhances security and governance by managing
access controls for both model endpoints and operations. Users benefit from a
unified platform where all models, whether LLM deployments or traditional
models, are seamlessly managed under a single service.
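To illustrate the kind of API access described above: NIM microservices expose an OpenAI-compatible chat-completions interface, so a client might interact with a deployed model endpoint roughly as sketched below. The base URL, model name, and bearer token are illustrative placeholders, not values documented by Cloudera; this is a minimal sketch under that assumption, not the service's actual client API.

```python
# Minimal sketch of calling a model endpoint behind an OpenAI-compatible
# /v1/chat/completions API, as exposed by NIM microservices.
# All endpoint details below are hypothetical placeholders.
import json
import urllib.request


def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Assemble an OpenAI-compatible chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


def send_chat_request(base_url: str, token: str, payload: dict) -> dict:
    """POST the payload to the endpoint and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Bearer token stands in for the endpoint-level access control
            # managed through the model registry.
            "Authorization": f"Bearer {token}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Hypothetical model name and prompt, for illustration only.
    payload = build_chat_request(
        "meta/llama-3.1-8b-instruct", "Summarize last quarter's support tickets."
    )
    print(json.dumps(payload, indent=2))
```

Because the endpoint stays inside the enterprise boundary, the prompt and response never transit a vendor-hosted model service, which is the privacy property the release emphasizes.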