Webinar

Accelerate MLOps and deliver intelligent applications with Kubernetes, CI/CD, and GitOps

Artificial intelligence (AI)-powered applications are increasingly delivering mainstream experiences to users. Whether predicting rainfall and its effects on specific terrain or powering a video game, AI models use machine learning (ML) to humanize the user experience in virtual and real-world applications.

Operationalizing these applications with integrated ML capabilities and keeping them up to date, a practice known as MLOps, helps maintain prediction accuracy. In this session, we will show you how to accelerate MLOps using an AI-powered game of Battleship that we built recently and that was played live by more than 8,000 users at Red Hat Summit.

MLOps requires collaboration among data scientists, developers, ML engineers, and IT operations teams, supported by a range of DevOps technologies. This can require significant effort and coordination.

We’ll briefly discuss how data scientists build, test, and train ML models on Kubernetes hybrid cloud platforms such as Red Hat OpenShift. Next, we will explore how the integrated DevOps CI/CD capabilities in Red Hat OpenShift (namely GitOps and Pipelines) allow us to automate and accelerate the integration of ML models into the application development process. Ultimately, these capabilities enable consistent application deployments at scale and help accelerate the frequent redeployment of updated ML models into production.
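
To make the idea concrete, here is a minimal, hypothetical sketch in Python (standard library only) of the kind of validation gate a pipeline task could run before a retrained model is promoted to the Git repository that a GitOps controller watches. The model, holdout data, and threshold below are illustrative placeholders, not the implementation used in the Battleship demo.

    # Hypothetical promotion gate a CI/CD pipeline step might run:
    # only promote a retrained model to the GitOps-managed repository
    # if it clears an accuracy threshold on a holdout set.
    # All values are illustrative placeholders.

    def predict(model, features):
        # Stand-in "model": a simple weighted-sum rule instead of a real trained artifact.
        score = sum(w * x for w, x in zip(model["weights"], features))
        return 1 if score > model["bias"] else 0

    def accuracy(model, samples):
        correct = sum(1 for features, label in samples if predict(model, features) == label)
        return correct / len(samples)

    if __name__ == "__main__":
        candidate = {"weights": [0.6, 0.4], "bias": 0.5}   # newly trained model
        holdout = [([1.0, 0.2], 1), ([0.1, 0.1], 0),
                   ([0.9, 0.8], 1), ([0.2, 0.3], 0)]       # evaluation data
        threshold = 0.9                                     # promotion gate

        score = accuracy(candidate, holdout)
        if score >= threshold:
            print(f"PASS: accuracy {score:.2f} meets {threshold}; promote via Git commit")
        else:
            raise SystemExit(f"FAIL: accuracy {score:.2f} below {threshold}; keep current model")

In a real pipeline, a step like this would typically load the serialized model from storage and, on success, update the deployment manifest so the GitOps controller rolls the new version out to production.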

In this webinar, we’ll talk about:

  • How data scientists build, test, and train ML models on Kubernetes hybrid cloud platforms like Red Hat OpenShift
  • How Red Hat OpenShift incorporates GitOps and Pipelines to automate and accelerate ML model integration during application development
  • How the integrated DevOps CI/CD capabilities in Red Hat OpenShift facilitate on-demand application deployment and updates

Live event date: Thursday, July 15, 2021 | 1:00 pm ET

On-demand event: Available for one year afterward.


Natale Vinto

Principal Product Marketing Manager, Technical Product Marketing, Red Hat

Natale Vinto is a software engineer with more than 10 years of experience in IT and ICT, working in telecommunications and on Linux operating systems. With a Java™ development background, he spent several years as an EMEA Specialist Solutions Architect for Red Hat OpenShift.

Currently, Natale is a Developer Advocate for Red Hat OpenShift, helping developer communities and customers succeed with their Kubernetes and cloud-native strategies.

Subin Modeel

Principal Software Engineer, AI Center of Excellence, Red Hat

Subin Modeel is a Principal Software Engineer in Red Hat's AI Center of Excellence. He works as part of a team of data scientists and engineers focused on machine learning (ML) and artificial intelligence (AI), including open source AI/ML frameworks and libraries such as TensorFlow, PyTorch, and MXNet, as well as CUDA.

Subin helps customers use ML libraries in hybrid cloud environments and contributes to the TensorFlow Project. He is also a chair of the Systems Group in MLCommons and an Apache committer.