Stay current with Red Hat Enterprise Linux. Attend this session to learn about plans, new features, and active development from the people responsible for the product. See how we're responding to our customers.
Whether you are developing smart products, enabling new connected services, or instrumenting factory production lines, you are facing the challenges of designing, building, and deploying an IoT solution. This session will first summarize the leading reference architectures and blueprints for Industrial IoT.
Learn to build robust clusters that provide high availability and the ability to run large numbers of applications in our Red Hat OpenShift Administration II (DO380) course.
Learn to transform a simple Java SE command line application into a multi-tiered enterprise application using Java EE specifications, including Enterprise JavaBeans, Java Persistence API, Java Message Service, JAX-RS for REST services, Contexts and Dependency Injection (CDI), and JAAS for security.
Learn to develop and deploy containerized software applications through hands-on training in our Red Hat OpenShift Development I: Containerizing Applications course. Register today at redhat.com/training.
Acquire the skills you need to deploy, administer, and operate virtual machines in your organization using Red Hat Virtualization in our Red Hat Virtualization (RH318) course. Register today at redhat.com/training.
Missed Red Hat Summit 2018 in San Francisco? Check out some key moments from the main stage, hear from attendees, and see what makes Red Hat Summit unique.
Join us next year in Boston, May 7-9, 2019.
Description: This short video summarizes the value that the Red Hat Connect program offers to software vendors throughout their business lifecycle.
Pranesh Medilall is a Red Hat Certified Engineer. He works for rhipe, a Red Hat Certified Cloud and Service Provider Distributor. In the last year alone, Pranesh has collaborated with 10 key partners to deliver innovative projects using Red Hat technologies.
Apache Kafka has emerged as a leading platform for building real-time data pipelines. It provides support for high-throughput/low-latency messaging, as well as sophisticated development options that cover all the stages of a distributed data-streaming pipeline—from ingestion to processing.
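The ingestion-to-processing flow described above can be sketched in Python. This is an illustration only: an in-memory queue stands in for a Kafka topic, and the event fields (`sensor`, `temperature`) are hypothetical; a real pipeline would use a Kafka client library against a running broker cluster.

```python
import json
import queue

# Stand-in for a Kafka topic: in a real deployment this would be a
# partitioned, replicated log managed by the Kafka brokers.
topic = queue.Queue()

def produce(event: dict) -> None:
    """Ingestion stage: serialize an event and append it to the topic."""
    topic.put(json.dumps(event).encode("utf-8"))

def consume_and_process() -> list:
    """Processing stage: drain the topic and transform each record."""
    results = []
    while not topic.empty():
        record = json.loads(topic.get().decode("utf-8"))
        # Example transformation: flag readings above a threshold.
        record["alert"] = record["temperature"] > 75
        results.append(record)
    return results

produce({"sensor": "line-1", "temperature": 80})
produce({"sensor": "line-2", "temperature": 60})
processed = consume_and_process()
```

The producer/consumer split mirrors Kafka's model: producers append serialized records to a topic, and consumers read and process them independently, which is what decouples ingestion from processing in a distributed pipeline.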