

Kubeflow is an open source project that provides Machine Learning (ML) resources on Kubernetes clusters. Kubernetes is evolving into a hybrid solution for deploying complex workloads on private and public clouds, and a fast-growing use case is using Kubernetes as the deployment platform of choice for machine learning.
Infrastructure engineers will often spend time modifying deployments before a single model can be tested. Because these deployments are often bound to the clusters they were deployed to, moving a model from a laptop to a cloud cluster is difficult without significant re-architecture.
The open source Kubeflow project addresses these concerns by making Machine Learning (ML) stacks on Kubernetes portable across environments. The project's GitHub repository contains:

  • JupyterHub to create and manage interactive Jupyter notebooks
  • A TensorFlow custom resource definition (CRD) that can be configured to use CPUs or GPUs and scaled to the size of a cluster (see the sketch after this list)
  • A TensorFlow Serving (TF Serving) container
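
As a rough illustration of how the TensorFlow custom resource is used once the CRD is installed, the sketch below submits a training job through the official Kubernetes Python client. The API group and version (kubeflow.org/v1alpha2), the kubeflow namespace and the training image are assumptions for illustration only; check the CRD version actually installed on your cluster.

# Minimal sketch: submit a TFJob custom resource with the Kubernetes Python client.
from kubernetes import client, config

config.load_kube_config()  # use the current kubeconfig context

tfjob = {
    "apiVersion": "kubeflow.org/v1alpha2",   # assumed CRD version
    "kind": "TFJob",
    "metadata": {"name": "mnist-train", "namespace": "kubeflow"},
    "spec": {
        "tfReplicaSpecs": {
            "Worker": {
                "replicas": 2,
                "template": {
                    "spec": {
                        "containers": [{
                            "name": "tensorflow",
                            # hypothetical training image
                            "image": "example.com/mnist-train:latest",
                        }]
                    }
                },
            }
        }
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="kubeflow.org",
    version="v1alpha2",
    namespace="kubeflow",
    plural="tfjobs",
    body=tfjob,
)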

Installing Kubeflow

There are two parts to Kubeflow on Kubernetes:

  1. A hypervisor - Kubernetes creates clusters of containers, and those clusters require compute, networking and storage. The hypervisor virtualizes the host computer's compute, networking and storage for the Kubernetes clusters.
  2. Packages on the host operating system that create clusters on the hypervisor and install Kubeflow's packages onto those clusters (a quick verification sketch follows this list).
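
Once the packages have been applied, the components Kubeflow installs (JupyterHub, the TFJob controller, TF Serving) show up as pods in the namespace it was installed into. A minimal sanity-check sketch with the Kubernetes Python client, assuming that namespace is named kubeflow:

# Minimal sketch: list the pods of a Kubeflow installation.
# The "kubeflow" namespace is an assumption; substitute your own.
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

for pod in core.list_namespaced_pod(namespace="kubeflow").items:
    print(pod.metadata.name, pod.status.phase)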

The following blog post by Boris Lublinsky of Red Hat partner Lightbend, one of nine parts in a series, details the procedures to install and configure Kubeflow on Red Hat OpenShift Container Platform.

