Webinar

Harness the power of AI: The operational consistency and flexibility you need to scale AI across your business and into the future

While organizations have used predictive AI for years, the emergence of generative AI (GenAI) has sparked a renewed focus on how AI can transform every aspect of a business, from reducing costs and improving customer experiences to delivering better products and services. As AI technology continues to evolve, organizations face challenges integrating it with their existing infrastructure across multiple environments, and the array of technology partners they need to work with keeps growing. The ability to scale across environments, and into the future, is a necessity.

Red Hat’s strategy for AI focuses on technology that simplifies the integration of AI into the business by driving operational consistency across teams and offering the flexibility to run AI workloads anywhere.

Join us for this three-part webinar series as we discuss these challenges and share how Red Hat’s platforms help organizations simplify the integration of AI into the business, drive operational consistency across teams, and provide the flexibility to run AI workloads anywhere.

We encourage you to attend each session to understand why every aspect is crucial to a successful AI strategy.

Webinar #1 - Red Hat AI to fuel innovation

Webinar #2 - Simplify AI adoption for faster time to value

Webinar #3 - Deploying containerized AI-enabled applications

Webinar 1

Red Hat AI to fuel innovation

Generative AI (GenAI) has changed the game with new products and service offerings, improved customer experiences, and gains in efficiency and productivity, but the technology is changing rapidly. Enterprise organizations of every size are under pressure to identify, choose, build, and deliver these AI solutions. They need ready-to-use solutions that provide a quick ramp-up and allow them to scale at their own pace without making large investments or having a strong AI knowledge base.

The key challenges that they face include: 

  • Option fatigue. They are met with too many options and approaches for getting started with their AI projects.
  • Budget, hardware, and data requirements. There is no one-size-fits-all solution. Both predictive and generative AI models require large data sets, skilled personnel, and specialty hardware, and deciding where to train or deploy models affects budget, hardware accessibility, and data concerns.
  • Scale across the model and application lifecycle. Once models are ready to be incorporated into existing and new applications, a new set of challenges arises: scalability, resource management (storage and hardware), lifecycle management, and monitoring. As more and more applications are modernized with AI, enterprise organizations will need to consider the impact on IT operations and the effort it takes to manage and automate the lifecycle of both models and applications.

Join this webinar to learn how Red Hat’s approach to AI helps organizations get started quickly and scale AI deployments consistently with enterprise-ready, curated open source AI innovation.

Live Event Date: Tuesday, October 29, 2024 | 11 AM ET

Speaker: Jennifer Vargas

Webinar 2

Simplify AI adoption for faster time to value

Technology breakthroughs are democratizing AI and making it easier to build gen AI applications. Organizations are now more prepared than ever to take advantage of this innovation, but they still struggle to realize value quickly.

A lack of collaboration between AI developers and data scientists, application developers, and IT causes friction that creates bottlenecks. AI development is a process, not a single step, and it requires organizations to operationalize AI development through an MLOps process that works in collaboration with their DevOps workflows.

Join this webinar to learn how Red Hat helps organizations enable the right technology and involve the right people to realize value from AI faster, with a common, modular, composable platform to build, train, tune, serve, deploy, monitor, and manage AI workloads in any environment. You will learn how Red Hat’s approach:

  • Simplifies adoption and reduces cost
  • Supports existing use cases as well as future innovation
  • Accelerates AI across the hybrid cloud

Live Event Date: Tuesday, November 5, 2024 | 11 AM ET

Speaker: Martin Isaksson

Webinar 3

Deploying containerized AI-enabled applications

AI has become an integral part of modern applications, enabling personalized experiences, predictive analytics, automation, and more. Deploying AI models requires addressing several key issues and having a strategy for deployment on a purpose-built platform. Assessing your data’s quality, volume, and relevance and choosing the right model to balance accuracy with performance are critical. But deployment isn’t just about the model; it’s also about the infrastructure. Organizations must consider compute, storage, and networking requirements, as well as legal and ethical standards, to ensure scalability, security, flexibility, and ease of use.

Red Hat helps automate the entire AI lifecycle, from model development to deployment and monitoring, leading to more reliable AI applications and quicker iteration cycles. This enables organizations to build, deliver, and manage their own AI-enabled applications and services across any cloud, on-premises, or edge environment.

Join this webinar to learn about:

  1. Key considerations before deployment
  2. Choosing a platform for AI deployment
  3. Deployment strategies and best practices

Live Event Date: Tuesday, November 12, 2024 | 11 AM ET

Speaker: Diego Torres Fuerte

Speakers

Jennifer Vargas

Senior Principal Product Marketing Manager, AI Business Unit, Red Hat

Jennifer is a product marketer helping organizations navigate their transition to the cloud and their adoption of AI initiatives. For the last 5 years, she has worked on a variety of emerging technologies, developing product strategies, launching new products, and testing new initiatives in nascent market segments. She enjoys solving business and technical challenges that appear disconnected. Her passion for software led her to change industries from oil and gas to information technology. Prior to joining Red Hat, Jennifer worked as a strategy consultant and technical sales lead, providing customers with proven solutions for complex scenarios in the utility, telco, and insurance industries.

Martin Isaksson

Lead, GTM & Business Development EMEA, AI Business Unit, Red Hat

Over the last decade, Martin has focused on exploring the opportunities that AI and deep learning offer to business leaders. With experience ranging from conducting AI research with Stanford University to leading a Silicon Valley AI startup, Martin now uses his skills to lead Red Hat’s AI go-to-market strategies and business development in EMEA.

Diego Torres Fuerte

Managing Architect, AI Practice Pre-sales, Red Hat

Diego Torres is a software architect with more than 10 years of experience helping customers implement intelligent applications through process automation, decision management, and artificial intelligence. As the managing architect for AI Practice pre-sales, he leads a talented team of consulting architects who provide AI consulting services and drive adoption of Red Hat OpenShift AI. Diego’s background in software development and technical enablement allows him to provide insightful advice on emerging technologies such as predictive and generative AI.