The world is not the same as it was a few months ago. No matter where we are, we have all been experiencing a new way of life. Social distancing and shelter-in-place mandates mean that companies are reevaluating how they work. With an unprecedented number of workers shifting to working from home full time, many for the first time in their careers, companies are learning how to adapt to a completely distributed workforce. We are suddenly faced with new challenges, including balancing working from home with parenting and keeping our families and loved ones safe when we leave the house for groceries. At work: how can we maintain business continuity? What do our customers need right now, and what is the best way to serve them?

At Red Hat, we live and breathe open source. It is not just what we do, it is who we are. More than ever, we believe that open source technologies, and the communities behind them, are driving new innovations such as edge computing, that these technologies can have a real impact on our world, and that they will shape our future. I am proud to showcase how Red Hat is leading the way in creating and enabling hybrid cloud technologies, paired with our commitment to open source communities, to help address many of the real challenges our customers face today and make a real impact on the world.

The changing datacenter

As I will discuss in my keynote during the Red Hat Summit Virtual Experience today, the role of data in our lives is rapidly evolving to fit the unique challenges we are facing today.

We have so much data at our fingertips that the very concept of the datacenter is changing. With the proliferation of devices generating data, pulling information from all of those streams and using it to make better decisions at the pace of our fast-moving world requires a new architecture. We need a robust, decentralized way to manage and process data, one that pushes compute out to the edge of the network.

Managing the cost and complexity of this new architecture at scale is a challenge for any business. What's more, it is not just about the quantity of data, but the quality of the information derived from it. A company's success depends on the quality of the information it has access to. And to use all of that data successfully, companies need a common platform that easily aggregates information from the edge to centralized infrastructure in both public and private clouds.

This is the hybrid cloud in action.

Red Hat believes the open hybrid cloud is the best way to make sense of all this data: managing it in a federated data store, summarizing information, finding patterns, taking action, and driving better business outcomes. An open hybrid cloud strategy minimizes latency while maximizing impact. Open hybrid cloud is bringing us into the future.

The importance of edge computing

As more and more of the global workforce has transitioned to remote work, networks are being pushed to new limits. Organizations now rely on things like video conferencing and streaming as an essential part of keeping the lights on. Because of this, expectations are growing for edge computing to support our highly distributed needs. Edge computing takes place at or near the physical location of the data or the user, giving customers fast, reliable services while giving the company the flexibility of hybrid cloud.

Accelerating 5G together with edge computing is quickly becoming critical across the globe. Beyond our most immediate challenges, edge computing can help with virtual and augmented reality, which have historically suffered from latency and bandwidth issues that cause delay or lag and break the experience for users. An immersive experience enables richer connections at work and at home, and can enable remote delivery of critical services like telemedicine. Because edge computing lets the compute- and data-intensive parts of the rendering pipeline be offloaded to the cloud, the resulting architecture is robust and scalable. Eventually, we could see this used in self-driving cars or other automated IoT applications. For example, BMW Group, one of this year's Red Hat Innovation Award winners, offers close to 230 PB of usable storage and the compute power to simulate up to 240 million kilometers of test data. Autonomous driving models will be pushed out to the edge in connected vehicles, while model training happens in cloud-scale environments.
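
To make the offload pattern concrete, here is a minimal sketch, not tied to any specific Red Hat product, of a client handing a compute-intensive frame to a nearby inference or rendering service running at an edge site; the endpoint URL and payload format are purely illustrative assumptions.

    # Minimal sketch of offloading heavy processing to a nearby edge service.
    # The endpoint URL and response format are hypothetical placeholders.
    import requests

    EDGE_ENDPOINT = "https://edge.example.com/v1/render"  # hypothetical edge service

    def offload_frame(frame_bytes: bytes) -> dict:
        """Send a compute-intensive frame to an edge site and return the result."""
        response = requests.post(
            EDGE_ENDPOINT,
            data=frame_bytes,
            headers={"Content-Type": "application/octet-stream"},
            timeout=0.1,  # tight budget: low latency is the point of edge computing
        )
        response.raise_for_status()
        return response.json()

    if __name__ == "__main__":
        with open("frame.bin", "rb") as f:
            print(offload_frame(f.read()))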

Red Hat is working with the open source community, customers, and partners to help organizations accelerate their edge computing strategies. We have worked with companies like NVIDIA to enable edge AI and 5G vRAN for our customers, worked with Vodafone Idea Limited to launch an intelligent, distributed edge computing platform in 100-plus locations across India, and most recently helped Verizon build a cloud-native 5G core. Edge computing takes the power of computing from the hands of the few and puts it in the hands of the many, and we're excited to help organizations across the world get edge ready.

Artificial intelligence and machine learning

Red Hat combines our work with key open source communities developing cutting-edge AI/ML technologies, our core platforms and services, and our rich partner ecosystem around our Kubernetes-based Red Hat OpenShift Container Platform to help developers build solutions to real-life problems.

These solutions, which used to take months of work deploying monolithic systems, can now be achieved in days. Practices like DevSecOps for artificial intelligence (AI) mean that developers can pick the tools that best fit their needs and jobs, while IT and security teams have the peace of mind of knowing that the pipeline, artifacts, and deployment are governed, repeatable, and secure.

We have been working closely with members of our Open Data Hub community, which provides a blueprint for building an AI-as-a-service platform on OpenShift and Red Hat Ceph Storage. Upstream, it inherits work from Kafka/Strimzi, and it is also the foundation of Red Hat's internal AI and data science program. We are proud to be working with technology partners such as Cloudera, Anaconda, and SAS to help propel their use of artificial intelligence and other emerging technologies in very real ways, such as facial recognition for security applications and monitoring safety on the factory floor.
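
As a rough illustration of the kind of streaming workload such a platform can feed, here is a minimal Python sketch using the kafka-python client against a Strimzi-managed Kafka cluster; the topic name, bootstrap address, and threshold are illustrative assumptions rather than anything shipped with Open Data Hub.

    # Minimal sketch: consume a Strimzi-managed Kafka stream and flag anomalies.
    # Broker address, topic name, and threshold are illustrative assumptions.
    import json
    from kafka import KafkaConsumer  # kafka-python client

    consumer = KafkaConsumer(
        "sensor-readings",  # hypothetical topic
        bootstrap_servers="my-cluster-kafka-bootstrap:9092",  # typical Strimzi bootstrap service name
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    THRESHOLD = 80.0  # example alerting threshold

    for message in consumer:
        reading = message.value
        if reading.get("temperature", 0.0) > THRESHOLD:
            print(f"Possible anomaly on {reading.get('device_id')}: {reading}")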

Quantum computing

About 10 years ago, the convergence of smartphones, 4G networks, and cloud computing laid the foundation for the world we take for granted today but couldn't imagine then. Moving forward, device proliferation, 5G networks, and edge computing are coming together to set the stage for the next generation of innovation, one with the potential to make profound impacts on our lives in ways that are hard to imagine today.

As our data continues to grow exponentially and we model more and more complex systems, how can computer processing keep up? One of the most fascinating emerging computational techniques is quantum computing. Harnessing the mysteries of quantum mechanics to operate on qubits creates the potential to solve challenges at a new scale. This is especially powerful for modeling the complexities of the physical world and could give rise to breakthroughs in areas like materials science or medicine.
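
For a sense of what programming qubits looks like today, here is a minimal sketch using Qiskit, one of several open source quantum SDKs, that entangles two qubits and samples the result on a local simulator; Qiskit APIs change between releases, so treat this as indicative rather than definitive.

    # Minimal sketch: build and sample a two-qubit entangled (Bell) state with Qiskit.
    # Follows the classic Aer/execute style; newer Qiskit releases may differ.
    from qiskit import QuantumCircuit, Aer, execute

    circuit = QuantumCircuit(2, 2)
    circuit.h(0)          # put qubit 0 into superposition
    circuit.cx(0, 1)      # entangle qubit 1 with qubit 0
    circuit.measure([0, 1], [0, 1])

    backend = Aer.get_backend("qasm_simulator")
    counts = execute(circuit, backend, shots=1024).result().get_counts()
    print(counts)  # expect roughly half '00' and half '11'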

This doesn't mean that classical computing is going away; rather, it shows that heterogeneous computing, leveraging the best compute environment for each task, is here to stay. In discussions with leaders in the quantum computing field, it's also clear that open source and community collaboration could play a major role in the evolution of quantum computing. If you would like to learn more, tune in to tonight's keynote, where I will host a panel discussion with leading quantum computing experts on what quantum computing is, how it relates to classical computing, and what is possible with this exciting new technology.

Red Hat’s commitment to open source

Red Hat is unrelenting in our commitment to open source. We understand that strong communities are critical to our products, and that keeping communities healthy is not only about maximizing adoption but also about enabling a project's longevity. Red Hat actively participates in upstream communities to develop depth of expertise and confidence in our product offerings. It also enables us to incorporate customer requirements faster, quickly fix bugs, and ship the best possible user experience we can. One of the ways we ensure the long-term supportability of our products and the sustainability of community projects is by following an upstream-first model: product changes we make go into the respective upstream communities first. We give back so that we can all move forward.

We have been all-in on open source from the beginning. When we started over 25 years ago, it was not with open core products but with truly open source offerings. Open core defeats the purpose of open source because it doesn't bring all of the benefits of the community to our customers. Red Hat holds the same perspective today. It's what makes us Red Hat.

Upstream in action and operate first

We are taking our commitment to “upstream first” one step further, with operate first.

A key attribute of the cloud is operational excellence. A hybrid cloud is no different. An open hybrid cloud can bring the power of open source communities to the operational models of a cloud. As it stands now, there is no upstream equivalent for operations, and with the introduction of operate first, we are looking to change this. With operate first, communities, and ultimately our products, can build operational knowledge directly into software. Leveraging tools such as Ansible and the Operator SDK, Red Hat is working with operations practitioners to glean insights into operational needs as a part of the software development cycle. When we operate first, we share operational knowledge, which can be as important as the code itself.
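
The Operator SDK generates its own Go or Ansible scaffolding, but as a rough Python analogue of encoding operational knowledge in software, the sketch below uses the Kubernetes Python client to express a health check an operations team might otherwise keep in a runbook; the deployment and namespace names are illustrative.

    # Minimal sketch: encode an operational health check as code using the
    # Kubernetes Python client. Deployment and namespace names are illustrative.
    from kubernetes import client, config

    def deployment_healthy(name: str, namespace: str) -> bool:
        """Return True if the deployment has all of the replicas it asked for."""
        config.load_kube_config()  # or config.load_incluster_config() inside a pod
        apps = client.AppsV1Api()
        dep = apps.read_namespaced_deployment(name, namespace)
        desired = dep.spec.replicas or 0
        ready = dep.status.ready_replicas or 0
        return ready >= desired

    if __name__ == "__main__":
        print(deployment_healthy("example-app", "example-namespace"))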

The world is changing. But Red Hat and open source technologies are here to help address many of the new and very real challenges that customers face today.


About the author

Chris Wright is senior vice president and chief technology officer (CTO) at Red Hat. Wright leads the Office of the CTO, which is responsible for incubating emerging technologies and developing forward-looking perspectives on innovations such as artificial intelligence, cloud computing, distributed storage, software defined networking and network functions virtualization, containers, automation and continuous delivery, and distributed ledger.
