The team launched an innovative AI-driven program built around flexibility and an experimental mindset. They created a ‘betmap’, an iterative, experiment-based alternative to a traditional roadmap. “For our initial foray into AI for support, we selected use cases with structured or semi-structured data to maximize success, such as knowledge-centered service (KCS) solution content, which already uses industry-standard formats and avoids privacy concerns,” said Manikandan Sivanesan, AI Technical Strategy Lead, Red Hat Experience Engineering.
The program focused on four key projects using the Red Hat AI portfolio: enhancing the Red Hat Customer Portal Case Management Troubleshooting tool, creating an AI summarization model to generate concise summaries of solutions, developing an AI-powered summarization tool to support case handovers, and piloting an AI-powered tool to automate the creation of standardized knowledge base articles.
The team initially experimented with AI models on Red Hat Enterprise Linux AI, which supports the development of Granite family large language models (LLMs) for enterprise applications. Validated applications were then scaled and deployed on Red Hat OpenShift AI on AWS, an enterprise-ready AI and machine learning (ML) platform designed for hybrid cloud environments.
The first project enhanced the Red Hat Customer Portal Case Management Troubleshooting tool, optimizing the order of knowledge articles presented to users based on their relevance to the issue description. The XE team, working in partnership with IBM Research and Red Hat IT, developed an AI-powered textual reranking solution built on a fine-tuned IBM Slate Retriever model running on Red Hat IT’s OpenShift AI cluster with GPU acceleration.
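At a high level, textual reranking scores each candidate article against the issue description and reorders the list by relevance. The sketch below illustrates the idea; the `embed` function is a trivial bag-of-words stand-in for the fine-tuned IBM Slate Retriever embeddings (which are not reproduced here), so the sketch runs anywhere:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Stand-in for the fine-tuned retriever's embeddings:
    # a simple bag-of-words vector so the sketch is self-contained.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rerank(issue_description: str, articles: list[str]) -> list[str]:
    # Order candidate knowledge articles by similarity to the issue text.
    query_vec = embed(issue_description)
    return sorted(articles, key=lambda a: cosine(query_vec, embed(a)), reverse=True)

articles = [
    "Configuring network bonding on Red Hat Enterprise Linux",
    "Pod fails to start on OpenShift due to image pull error",
    "Tuning kernel parameters for database workloads",
]
ranked = rerank("openshift pod image pull error", articles)
print(ranked[0])  # the image pull article ranks first
```

In the production system, a learned retriever model replaces the toy embedding, but the rerank-by-score pattern is the same.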
With approximately 130,000 knowledge base solutions, many previously undiscoverable, the team created an AI summarization model to generate accurate, concise snippets for each solution. The model runs on IBM watsonx.ai using OpenShift AI’s data science pipelines.
High-severity support cases often require handovers across associates, time zones, and teams. To streamline these transitions, the team developed an AI-powered summarization tool that generates concise summaries of support case interactions on demand. Initially designed for 24x7 ‘follow the sun’ cases, it has since been expanded to additional support groups. The tool was built using Mistral LLMs on Red Hat IT’s OpenShift AI cluster with GPU acceleration; the team is now testing Granite models and exploring further fine-tuning with InstructLab, an open-source project by IBM and Red Hat.
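A handover summarizer of this kind typically assembles the case comment history into a transcript, wraps it in a prompt, and sends it to the model. The sketch below shows that flow under stated assumptions: the prompt wording, the `CaseComment` structure, and the `generate` callable (which would invoke the Mistral model served on OpenShift AI) are all hypothetical, and a trivial extractive fallback keeps the example runnable:

```python
from dataclasses import dataclass

@dataclass
class CaseComment:
    author: str
    timestamp: str
    text: str

# Hypothetical prompt; the real system's prompt is not public.
HANDOVER_PROMPT = (
    "Summarize the support case below for an engineer taking over the case. "
    "Cover the reported issue, actions taken so far, and next steps.\n\n{transcript}"
)

def build_transcript(comments: list[CaseComment]) -> str:
    # Flatten the comment history into a readable transcript.
    return "\n".join(f"[{c.timestamp}] {c.author}: {c.text}" for c in comments)

def summarize_handover(comments: list[CaseComment], generate=None) -> str:
    # `generate` would call the LLM endpoint; the fallback is a
    # trivial extractive stand-in so this sketch runs offline.
    prompt = HANDOVER_PROMPT.format(transcript=build_transcript(comments))
    if generate is not None:
        return generate(prompt)
    return "Latest update: " + comments[-1].text

comments = [
    CaseComment("customer", "2024-05-01T09:00Z", "Cluster upgrade stuck at 60%."),
    CaseComment("engineer", "2024-05-01T11:30Z", "Collected must-gather; suspect a degraded operator."),
]
print(summarize_handover(comments))
```

Generating the summary on demand, rather than storing one per case, keeps it current with the latest comments at the moment of handover.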
The team also piloted an AI-powered tool to automate the creation of standardized knowledge base articles, aligning with KCS v6 standards: “Our tool uses support case comments to draft articles quickly and consistently, reducing reliance on manual efforts,” said Mike Clark, Senior Manager, Software Engineering, Red Hat Experience Engineering. “Built using Mistral LLMs and GPU acceleration on OpenShift AI, this solution enhances self-service capabilities, allowing support associates to focus on solving new issues.”
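Drafting a KCS-style article from case comments amounts to extracting each standard field (issue, environment, resolution) from the raw discussion and slotting it into a consistent template. The sketch below illustrates that pattern; the template layout and the `generate` callable (which would prompt the Mistral model per field) are assumptions, with a stub fallback so the example runs:

```python
# Hypothetical KCS-style article template; field names follow the
# common KCS structure (Issue / Environment / Resolution).
KCS_TEMPLATE = """\
# {title}

## Issue
{issue}

## Environment
{environment}

## Resolution
{resolution}
"""

def draft_kcs_article(title: str, case_comments: list[str], generate=None) -> str:
    # `generate(field, text)` would prompt the LLM to extract one KCS
    # field from the comments; the stub keeps this sketch runnable.
    if generate is None:
        generate = lambda field, text: f"(draft {field} extracted from case comments)"
    joined = "\n".join(case_comments)
    return KCS_TEMPLATE.format(
        title=title,
        issue=generate("issue", joined),
        environment=generate("environment", joined),
        resolution=generate("resolution", joined),
    )

print(draft_kcs_article("Pod stuck in ImagePullBackOff", [
    "Customer reports pods failing to pull images.",
    "Resolved by fixing the registry pull secret.",
]))
```

Keeping the template fixed and asking the model only to fill the fields is what makes the generated drafts consistent from article to article.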