Understanding AI
Artificial intelligence (AI) refers to computer science techniques and statistical algorithms that simulate and augment human intelligence.
Red Hat AI Enterprise
Featured
What is vLLM?
What is sovereign AI?
Red Hat Launches Red Hat AI Enterprise to Deliver a Unified AI Platform that Spans from Metal to Agents
Foundations of AI
What is machine learning?
What is deep learning?
What are foundation models for AI?
What are large language models?
SLMs vs LLMs: What are small language models?
What is AI inference?
AI infrastructure explained
What is an AI platform?
Types of AI
What is generative AI?
Predictive AI vs. generative AI
What is agentic AI?
Agentic AI vs. generative AI
Model enhancements
What is retrieval-augmented generation?
RAG vs. fine-tuning
What is parameter-efficient fine-tuning (PEFT)?
LoRA vs. QLoRA
What is InstructLab?
What is vLLM?
What is Model Context Protocol (MCP)?
What is Model-as-a-Service?
What are Granite models?
AI at scale
What is sovereign AI?
What is llm-d?
What is distributed inference?
What is enterprise AI?
What is edge AI?
What is MLOps?
What is LLMOps?
AIOps explained
What is AI security?
Understanding AI/ML use cases
What is AI in healthcare?
AI in banking
Understanding AI in telecommunications
Red Hat AI use cases
Generative AI
Produce new content, like text and software code.
Red Hat AI lets you run the generative AI models of your choice faster, with fewer resources and lower inference costs.
Predictive AI
Connect patterns and forecast future outcomes.
With Red Hat AI, organizations can build, train, serve, and monitor predictive models, all while maintaining consistency across the hybrid cloud.
Operationalized AI
Create systems that support the maintenance and deployment of AI at scale.
With Red Hat AI, manage and monitor the lifecycle of AI-enabled applications while saving on resources and ensuring compliance with privacy regulations.
Agentic AI
Build workflows that perform complex tasks with limited supervision.
Red Hat AI provides a flexible approach and stable foundation for building, managing, and deploying agentic AI workflows within existing applications.
Red Hat AI portfolio
Scale your AI foundation
- Customize models with control.
- Optimize resource allocation.
Optimize model performance
- Fast inference at scale.
- Powered by vLLM.
Build and deploy AI applications
- Manage the full AI lifecycle.
- Implement AI guardrails.
Run LLMs on an individual server
- Develop, test, and run gen AI.
- Fast, flexible inference.
AI customer stories from Red Hat Summit and AnsibleFest 2025
Turkish Airlines doubled deployment speed with organization-wide data access.
JCCM improved the region's environmental impact assessment (EIA) processes using AI.
Denizbank sped up time to market from days to minutes.
Hitachi operationalized AI across its entire business with Red Hat AI.