Red Hat Enterprise Linux AI

Red Hat® Enterprise Linux® AI is a platform for running large language models (LLMs) in individual server environments. The solution includes Red Hat AI Inference, an end-to-end stack that provides fast, consistent, and cost-effective inference across the hybrid cloud. 


What is Red Hat Enterprise Linux AI?

Red Hat Enterprise Linux AI brings together:

  • Red Hat AI Inference, which provides the operational control to run any model on any accelerator across the hybrid cloud.
  • A bootable image of Red Hat Enterprise Linux, including popular AI libraries such as PyTorch and hardware-optimized inference for NVIDIA, Intel, and AMD.
  • Enterprise-grade technical support and Open Source Assurance legal protections. 

Fast, flexible inference

When it comes to implementing generative AI, you have 2 choices: adapt your strategy to fit a pre-built product, or engineer a custom solution that aligns directly with your business goals. 

Rooted in open source, Red Hat Enterprise Linux AI provides safeguarded and reliable infrastructure to run any model, on any hardware, with any accelerator, across the hybrid cloud. The solution includes Red Hat AI Inference, which offers a solid foundation for both agentic AI and internal Model-as-a-Service patterns. With the integrated vLLM runtime and LLM compressor, you can maximize throughput and minimize latency for fast, cost-effective model deployments. Because the operating system and the inference server are packaged together, you can begin serving the models of your choice in the environment that’s best for you.
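To make the serving model concrete: vLLM exposes an OpenAI-compatible REST API, so any standard HTTP client can query a served model. The sketch below builds such a request with only the Python standard library; the endpoint URL and model name are placeholders for whatever you actually deploy, not values prescribed by Red Hat Enterprise Linux AI.

```python
import json
import urllib.request

# Placeholder endpoint and model -- substitute the host and model you serve.
# vLLM's server speaks the OpenAI chat-completions API at /v1/chat/completions.
ENDPOINT = "http://localhost:8000/v1/chat/completions"
MODEL = "ibm-granite/granite-3.1-8b-instruct"

def build_request(prompt: str) -> urllib.request.Request:
    """Assemble an OpenAI-style chat-completion request for a vLLM server."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
        "temperature": 0.2,
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending the request (urllib.request.urlopen(req)) requires a running server.
req = build_request("Summarize our open support tickets.")
```

Because the API is OpenAI-compatible, existing client libraries and tooling work against the same endpoint without modification.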

Features and benefits


LLMs for the enterprise

Adapt smaller, purpose-built models to your own data using methods like retrieval-augmented generation (RAG).
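The core RAG idea is small: retrieve relevant context from your own data and ground the prompt in it before the model answers. This minimal sketch uses naive keyword-overlap retrieval purely for illustration; a production system would use embeddings and a vector store, and the documents shown are invented examples.

```python
# Minimal RAG sketch: retrieve context, then build a grounded prompt.
# The retrieval here is naive word overlap, used only to show the flow.

def retrieve(query: str, documents: list[str], k: int = 1) -> list[str]:
    """Rank documents by word overlap with the query; return the top k."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Ground the prompt in retrieved context before it reaches the model."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

docs = [
    "Refunds are processed within 5 business days.",
    "Our data centers run Red Hat Enterprise Linux.",
]
prompt = build_prompt("How long do refunds take?", docs)
```

The grounded prompt is then sent to whatever served model you choose; only the retrieval and prompt-assembly steps change when you swap in real embeddings.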

Increased efficiency

Powered by vLLM, Red Hat AI Inference increases efficiency without sacrificing performance. 


Cloud-native scalability

Red Hat Enterprise Linux image mode lets you manage your AI platform as a container image, streamlining your approach to scaling.
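With image mode, the operating system is defined and versioned like any other container image via a bootc Containerfile. The fragment below is an illustrative sketch, not the Red Hat Enterprise Linux AI build recipe; the base image tag and installed package are examples.

```dockerfile
# Illustrative bootc Containerfile for RHEL image mode.
# The base image tag and packages are examples, not a prescribed build.
FROM registry.redhat.io/rhel9/rhel-bootc:latest

# Layer additional content exactly as you would in an application image.
RUN dnf -y install python3-pip && dnf clean all
```

The resulting image is built, pushed, and rolled out with standard container tooling, which is what lets the AI platform scale like any other containerized workload.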

233% ROI with Red Hat AI

Red Hat commissioned Forrester Consulting to conduct a Total Economic Impact™ (TEI) study and examine the potential return on investment (ROI) enterprises may realize by deploying Red Hat AI. 

After interviewing Red Hat AI customers, the study found that a composite organization realized a 233% ROI over 3 years, with total benefits worth more than triple its initial investment.

Why choose Red Hat AI?

Build on a trusted foundation that supports any model and any agent on any hardware accelerator—across the hybrid cloud. Red Hat AI gives organizations the freedom to deploy where their data, compliance, and cost requirements demand.

Inference

Manage model complexity with fast, efficient inference powered by vLLM and the control to run any model on any accelerator across the hybrid cloud.

Data

Customize domain-specific agentic AI use cases with models connected to your organization’s own private data.

Agents

Simplify and accelerate your journey to successful agentic AI adoption with governance and control.

Platform

Deploy resilient, trustworthy AI solutions on a foundation of open source transparency and hybrid cloud scalability.

Deploy with partners

Experts and technologies are coming together so our customers can do more with AI. Explore all of the partners working with Red Hat to certify the interoperability of their technologies with our solutions.

Dell Technologies
Cisco
Intel
NVIDIA
AMD

AI customer stories from Red Hat Summit and AnsibleFest 2025

Turkish Airlines

Turkish Airlines doubled the speed of deployment times with organization-wide data access.

JCCM

JCCM improved the region's environmental impact assessment (EIA) processes using AI.

DenizBank

DenizBank sped up time to market from days to minutes.

Hitachi

Hitachi operationalized AI across its entire business with Red Hat AI.

Frequently asked questions

What is the difference between Red Hat Enterprise Linux AI and Red Hat Enterprise Linux?

Red Hat Enterprise Linux AI is a foundation model platform for running LLMs in individual server environments. The solution includes Red Hat AI Inference, which delivers fast, cost-effective hybrid cloud inference by maximizing throughput, minimizing latency, and reducing compute costs.

Red Hat Enterprise Linux is a commercial open source Linux distribution developed by Red Hat for the commercial market that provides a flexible and stable foundation to support hybrid cloud innovation.

Red Hat Enterprise Linux AI is delivered as a Red Hat Enterprise Linux bootable image that includes AI libraries and Granite models.

Do I need to buy Red Hat Enterprise Linux to use Red Hat Enterprise Linux AI?

No, a Red Hat Enterprise Linux AI license is sufficient and includes all of the components needed.

What’s included in Red Hat Enterprise Linux AI?

Red Hat Enterprise Linux AI is delivered as a bootable Red Hat Enterprise Linux image that includes:

  • Red Hat AI Inference, with the integrated vLLM runtime and LLM compressor.
  • Popular AI libraries such as PyTorch, with hardware-optimized inference for NVIDIA, Intel, and AMD accelerators.
  • Granite models.
  • Enterprise-grade technical support and Open Source Assurance legal protections.

What’s the difference between Red Hat Enterprise Linux AI and Red Hat OpenShift AI?

Red Hat Enterprise Linux AI provides out-of-the-box large language models in a single server. The solution includes Red Hat AI Inference, which provides an immutable, purpose-built appliance for inference. By packaging the operating system (OS) and application together, Red Hat Enterprise Linux AI facilitates day-one operations for AI inference across the hybrid cloud by maximizing throughput, minimizing latency, and reducing compute costs.

Red Hat OpenShift® AI provides all of the tools needed to help customers build AI-enabled applications at scale. Red Hat OpenShift AI offers a comprehensive, integrated MLOps platform to help manage the lifecycle of models, ensuring support for distributed compute, collaboration workflows, monitoring, and hybrid-cloud applications.

Red Hat OpenShift AI includes access to Red Hat Enterprise Linux AI, so teams can use the same models and alignment tools in their OpenShift AI architecture as well as benefit from additional enterprise MLOps capabilities.

How is Red Hat Enterprise Linux AI priced?

The Red Hat Enterprise Linux AI license is priced per accelerator.

Explore more AI resources

4 considerations for choosing the right AI model

Maximize AI innovation with open source models

4 reasons to use open source small language models

Unlock generative AI innovation with Red Hat AI

Contact Sales

Talk to a Red Hatter

Reach out to our sales team below for Red Hat Enterprise Linux AI pricing information. 
To learn more about our partnerships, visit our catalog page.


1. Forrester Consulting study, commissioned by Red Hat. “Forrester Total Economic Impact™ Of Red Hat AI.” February 2026.