I’m excited to announce the general availability of Red Hat Enterprise Linux AI (RHEL AI) 1.2, our generative AI (gen AI) foundation model platform to develop, test and run large language models (LLMs) for enterprise applications. RHEL AI combines open source Granite LLMs with InstructLab model alignment tools on a purpose-built RHEL image optimized for gen AI use cases.
Building on the success of our initial release, RHEL AI 1.1, on September 5, 2024, this version furthers our commitment to empowering developers, AI engineers and data scientists by lowering the barriers to entry and adoption for enterprise AI. RHEL AI 1.2 brings significant enhancements, allowing organizations to more efficiently fine-tune and deploy LLMs using private, confidential and sovereign data to better align to enterprise use cases. These improvements, powered by InstructLab and a comprehensive software stack, now support a wider range of infrastructure options, including NVIDIA accelerated computing and software as well as the newly introduced AMD Instinct accelerators. We intend to continue expanding our hardware accelerator support with partners like Intel in upcoming releases.
Key highlights of RHEL AI 1.2:
Support for Lenovo ThinkSystem SR675 V3 servers
RHEL AI 1.2 is now supported on Lenovo ThinkSystem SR675 V3 servers with NVIDIA accelerated computing. Users can also take advantage of factory preload options for RHEL AI on these servers, making deployment faster and easier.
Support for AMD Instinct Accelerators (technology preview)
Language models require powerful computing resources, and RHEL AI now supports AMD Instinct Accelerators with the full ROCm software stack, including drivers, libraries and runtimes. With RHEL AI 1.2, organizations can leverage AMD Instinct MI300X GPUs for both training and inference, and AMD Instinct MI210 GPUs for inference tasks.
Availability on Azure and GCP
RHEL AI is now available on Microsoft Azure and Google Cloud Platform (GCP). With this, users can download the RHEL AI image from Red Hat, bring it to Azure or GCP, and create RHEL AI-based GPU instances.
Training checkpoint and resume
Long training runs during model fine-tuning can now be saved at regular intervals, thanks to periodic checkpointing. This feature allows InstructLab users to resume training from the last saved checkpoint instead of starting over, saving valuable time and computational resources.
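If you are curious what this pattern looks like in practice, here is a minimal, generic PyTorch sketch of saving and resuming from a checkpoint. It is illustrative only and not InstructLab's actual training code; the model, optimizer and checkpoint.pt path are placeholders.

```python
import os
import torch

CKPT_PATH = "checkpoint.pt"  # placeholder path; ilab manages its own checkpoint locations

model = torch.nn.Linear(128, 128)  # stand-in for the model being fine-tuned
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
start_epoch = 0

# Resume from the last saved checkpoint if one exists, instead of starting over.
if os.path.exists(CKPT_PATH):
    state = torch.load(CKPT_PATH)
    model.load_state_dict(state["model"])
    optimizer.load_state_dict(state["optimizer"])
    start_epoch = state["epoch"] + 1

for epoch in range(start_epoch, 10):
    # ... one epoch of training would run here ...
    # Save a checkpoint at a regular interval (here, every epoch).
    torch.save(
        {"model": model.state_dict(),
         "optimizer": optimizer.state_dict(),
         "epoch": epoch},
        CKPT_PATH,
    )
```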
Auto-detection of hardware accelerators
The ilab CLI can now automatically detect the type of hardware accelerator in use and configure the InstructLab pipeline accordingly for optimal performance, reducing the manual setup required.
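As a rough illustration of what such detection involves, the sketch below uses PyTorch's device queries to tell NVIDIA CUDA and AMD ROCm systems apart. It is an assumption-laden example, not the actual ilab implementation.

```python
import torch

def detect_accelerator() -> str:
    """Return a coarse label for the available accelerator (illustrative only)."""
    if torch.cuda.is_available():
        # PyTorch exposes AMD ROCm devices through the same torch.cuda API;
        # torch.version.hip is set on ROCm builds and is None on CUDA builds.
        return "rocm" if torch.version.hip else "cuda"
    return "cpu"

print(detect_accelerator())
```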
Enhanced training with PyTorch FSDP (technology preview)
For multi-phase training of models with synthetic data, ilab train now uses PyTorch Fully Sharded Data Parallel (FSDP). This dramatically reduces training times by sharding a model’s parameters, gradients and optimizer states across data parallel workers (e.g., GPUs). Users can select FSDP for their distributed training by using ilab config edit.
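To make the FSDP idea concrete, here is a minimal PyTorch sketch that shards a toy model across data-parallel workers. The model, process group setup and hyperparameters are placeholders, and this is not InstructLab's training loop; in RHEL AI 1.2 you simply select FSDP through ilab config edit.

```python
import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

# Assumes a launch via `torchrun --nproc_per_node=<num_gpus> fsdp_sketch.py`,
# which sets the environment variables that init_process_group reads.
dist.init_process_group(backend="nccl")
torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())

model = torch.nn.Sequential(        # stand-in for a real LLM
    torch.nn.Linear(1024, 4096),
    torch.nn.GELU(),
    torch.nn.Linear(4096, 1024),
).cuda()

# FSDP shards parameters, gradients and optimizer states across the workers,
# so each GPU holds only a slice of the full model and optimizer state.
model = FSDP(model)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

x = torch.randn(8, 1024, device="cuda")
loss = model(x).pow(2).mean()
loss.backward()
optimizer.step()

dist.destroy_process_group()
```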
These are just a few of the exciting new features in RHEL AI 1.2. Many more improvements and bug fixes are included, making this release a powerful tool for AI development.
Don’t miss out on these powerful new features! Download RHEL AI 1.2 today, deploy it on-premises or across major public cloud providers, and take your AI development to the next level.
Important notice:
With the introduction of RHEL AI 1.2, we will be deprecating support for RHEL AI 1.1 in 30 days. Please ensure your systems are upgraded to RHEL AI 1.2 to continue receiving support.