At Red Hat Summit 2025, we introduced Ask Red Hat, a conversational AI designed to be an intelligent front door for our customers. It began as a rapid 12-week build to prove that open source AI could transform the support experience.
Today, Ask Red Hat has evolved from a proof of concept into a sophisticated production reality. As of late 2025, it has served over 50,000 unique users and handled more than 450,000 messages. It is no longer just a standalone tool but a cross-product orchestration layer, integrated directly into docs.redhat.com and new support case creation in the Red Hat Customer Portal. Ask Red Hat is delivering on its goal: Helping customers solve problems the very moment they arise.
The technical engine: Granite and the guardrails of trust
While much of the industry is still experimenting with models in isolation, Red Hat has operationalized the IBM Granite family to solve specific customer pain points. Our leadership in this space is defined by how we balance power with precision:
- The model: We use the Granite-3.x 8B-Instruct variant for its high efficiency and precision in retrieval-augmented generation (RAG), with plans to migrate to Granite-4.x Small in the near future.
- The safety net: We’ve implemented a robust safety architecture using Granite Guardian models to serve as "guardrails." These evaluate both user inputs and AI outputs in real time to prevent jailbreaks and help keep the assistant on-task.
- Precision retrieval: In December 2025, we launched a retrieval optimization phase that delivered a 45% improvement in MRR (mean reciprocal rank), helping customers get version-appropriate documentation for the products they ask about.
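MRR, the metric behind that 45% improvement, is straightforward to compute: for each query, take the reciprocal of the rank at which the first relevant document appears (0 if none appears), then average across all queries. A minimal sketch, with illustrative document IDs rather than real Ask Red Hat data:

```python
def mean_reciprocal_rank(ranked_results, relevant):
    """Compute MRR over a batch of queries.

    ranked_results: one ranked list of document IDs per query.
    relevant: one set of relevant document IDs per query.
    """
    total = 0.0
    for docs, rel in zip(ranked_results, relevant):
        for rank, doc_id in enumerate(docs, start=1):
            if doc_id in rel:
                total += 1.0 / rank
                break  # only the first relevant hit counts toward MRR
    return total / len(ranked_results)


# Query 1 finds its relevant doc at rank 1, query 2 at rank 2:
# MRR = (1/1 + 1/2) / 2 = 0.75
print(mean_reciprocal_rank([["a", "b"], ["c", "d"]], [{"a"}, {"d"}]))
```

A rising MRR means the right document is climbing toward the top of the results, which is exactly what version-appropriate retrieval needs.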
Measuring the unseen: The value of AI and Ask Red Hat
The biggest question for the enterprise today is "how do we measure AI success?" As the first step toward an intelligent front door, Ask Red Hat has an impact across almost every team at Red Hat in one way or another. Our original founding objective was to help customers self-solve whenever possible.
To further highlight this goal, our lead back-end engineer, who began his career in Red Hat technical support, once said, “If I can solve a problem myself in five minutes I’d rather do that than wait for Support to call me back, even if it’s only an hour.” If we can help customers in moments when they would otherwise have had to open a support case, they are happy, and we can measure that happiness as cost avoidance within Red Hat Support.
We use a specific internal framework to calculate this success. By measuring instances where a user engages with Ask Red Hat and then chooses not to open a support case, we can quantify the business impact.
The Red Hat cost avoidance formula:
- Total cost avoidance = (count of potential cases successfully resolved through customer self-solves using Ask Red Hat) × (calculated cost of a support case)*
*While specific internal costs remain proprietary, this formula allows us to track the financial resilience provided by AI
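In code, the framework reduces to a single multiplication. The figures below are purely illustrative; they are not Red Hat's actual per-case cost, which remains proprietary:

```python
def total_cost_avoidance(self_solved_cases: int, cost_per_case: float) -> float:
    """Cost avoidance = cases self-solved via the assistant × cost of a support case.

    cost_per_case is an assumed placeholder, not a real internal figure.
    """
    return self_solved_cases * cost_per_case


# Example with made-up numbers: 100 self-solved cases at $250 per case.
print(total_cost_avoidance(100, 250.0))  # 25000.0
```

The hard part in practice is the first operand: attributing a "case not opened" to the assistant requires tracking users who engage with Ask Red Hat and then abandon the case-creation flow.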
In 2025 alone, Ask Red Hat contributed approximately $1.5M in cost avoidance through avoided technical support cases. Each of these represents a time when a customer or partner was able to solve a problem themselves more quickly, without waiting even for Red Hat’s fast support response times, because Ask Red Hat gave them what they needed. This proves that enterprise AI is not just an efficiency play; it is a foundational shift in how we deliver value across the customer experience: improving customer time-to-resolution while reducing the load on our support engineers.
Looking ahead: The roadmap to agentic AI
We are rapidly moving from an assistant that simply answers to an agent that acts. Our upcoming roadmap focuses on making the customer experience even easier:
- Single point of entry orchestration layer: We are moving toward a multi-agent experience where Ask Red Hat can coordinate between different specialized AI tools to solve complex, multi-step issues.
- Nearly real-time RAG: We have already optimized our pipeline so that documentation updates are reflected in Ask Red Hat in under four minutes, keeping the AI tool consistently aligned to the latest source of truth.
By grounding our AI strategy in open source, transparent ROI, and rigorous safety guardrails, Red Hat has successfully moved past the experimental phase of AI and into a future of documented customer success.
You can learn more about the Ask Red Hat technical details on our AI system card, as well as the Ask Red Hat solution architecture on the Red Hat Architecture Center.
About the author
Matt Ruzicka is a Portfolio Content Strategist for Red Hat’s Intelligent Experience Delivery team with over a decade of experience at Red Hat—spanning AI solutions, customer success, and Technical Account Management. They focus on bridging the gap between complex technology and meaningful customer outcomes.