WHY A HYBRID CLOUD?
A hybrid cloud originally just meant a cloud that combined private and public cloud resources. But as cloud computing has evolved, users have come to think of hybrid in broader terms.
Today, hybrid also covers heterogeneous on-premises resources, including private clouds, traditional virtualization, bare-metal servers, and containers. It encompasses multiple providers and types of public clouds.
In short, IT infrastructures, and the services that run on them, are hybrid across many dimensions. Most organizations face a simultaneous requirement to modernize and optimize their software-defined datacenters (SDDCs) and to deploy new cloud-native infrastructure. Most also use services from several public clouds. And there is a widespread need to bridge and integrate across these different infrastructures, both to allow for consistent processes and business rules and to let teams pick the best infrastructure for a given workload.
However, hybrid should not mean silos of capacity. Adding cloud silos increases complexity rather than reducing it.
This is not to say that we cannot start our journey to a cloud on a subset of infrastructure. In most cases, a pilot project or proof-of-concept using a subset of applications will indeed be the prudent path. The difference is that a proof-of-concept is a first step; a new silo is a dead end.
Taking an open approach to cloud is a key way to avoid a siloed cloud future.
INNOVATION THROUGH OPEN SOURCE
Entire new categories of software are open source by default. That’s because the community development model works. Open source underpins the infrastructure of some of the most sophisticated web-scale companies, like Facebook and Google. Open source stimulates many of the most significant advances in the worlds of cloud infrastructures, cloud-native applications, and big data.
Open source enables contributions and collaboration within communities, letting more contributors work together with less friction. Furthermore, as new computing architectures and approaches rapidly evolve for cloud computing, big data, and the Internet of Things (IoT), it is also becoming evident that the open source development model is extremely powerful because of how it allows innovations from multiple sources to be recombined and remixed in powerful ways. To give just one example, the complete orchestration, resource placement, and policy-based management of a microservices-based, containerized environment can draw on code from many different communities and combine it in different ways depending upon the requirements.
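To make the orchestration and policy-based placement example above concrete, here is a minimal sketch of how a Kubernetes Deployment (one widely used open source orchestrator) might express replica counts, resource policies, and node placement for a containerized microservice. The service name, image, and node label are hypothetical, chosen purely for illustration:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: payments-api            # hypothetical microservice
spec:
  replicas: 3                   # orchestration: desired instance count
  selector:
    matchLabels:
      app: payments-api
  template:
    metadata:
      labels:
        app: payments-api
    spec:
      nodeSelector:
        disktype: ssd           # placement policy: SSD-backed nodes only
      containers:
      - name: payments-api
        image: registry.example.com/payments-api:1.4   # illustrative image
        resources:
          requests:             # resource policy: scheduling guarantees
            cpu: "250m"
            memory: 256Mi
          limits:               # resource policy: enforced ceilings
            cpu: "500m"
            memory: 512Mi
```

The scheduler, the container runtime, and the policy mechanisms invoked by this one manifest each come from different open source communities, illustrating the recombination described above.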
The open source development model and open source communities help to:
- Provide the interoperability and workload portability that cloud users need.
- Enable software-defined, cloud-native infrastructures, their applications, and DevOps processes for developing and operating those applications.
- Create the bridges between new infrastructures and workloads and classic IT—for example, by connecting back-end systems to new applications through business rules and message buses.
- Preserve existing investments while providing IT with the strategic flexibility to deploy on their infrastructure of choice, whether physical servers, legacy virtualization, private clouds, or public clouds.
Figure 1. The open hybrid cloud journey
BEYOND OPEN SOURCE IN THE CLOUD
The “open” in open hybrid cloud is about more than open source code. As we have discussed, it is also about engaging with innovative communities. It is about interoperability, workload portability, and strategic flexibility. And it is about making open source suitable for critical deployments through quality assurance and integration, working within upstream projects, and having predictable and stable life-cycle support.
Open source allows adopters to control their particular implementation and does not restrict them to the technology and business roadmap of a specific vendor.
A viable, independent community is the single most important element of many open source projects. Delivering maximum innovation means having the right structures and organization in place to fully take advantage of the open source development model.
Open standards do not necessarily require formal standardization efforts, but they do require a consensus among communities of developers and users. Approaches to interoperability that are not under the control of individual vendors, or tied to specific platforms, offer important flexibility.
Freedom to use intellectual property (IP) is needed to use technology without constraints. Even “reasonable and non-discriminatory” license terms can still require permission or impose other restrictions.
Platform choice lets operations and application development teams use the right infrastructure. Tools like cloud management should not be tied to a specific virtualization or other foundational technology. For example, at one time, managing only physical servers and virtual machines was a reasonable goal for a management product. Then came private and public clouds, then additional public clouds, and now containers as well.
Portability can be a tradeoff. Sometimes, using a feature that is specific to a particular public cloud provider is the right business decision. Nonetheless, technologies such as container and cloud management platforms can maximize the degree to which applications and services can be deployed across a variety of infrastructures, and redeployed elsewhere if needs or conditions change.
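One way containers deliver the portability just described is through a standard image format: an image built once from a container file runs unchanged on any conformant runtime, whether on-premises or in a public cloud. The sketch below assumes a simple Python service; the base image and file names are illustrative only:

```
# Illustrative Containerfile: package an application and its
# dependencies into a portable, runtime-agnostic image.
FROM registry.access.redhat.com/ubi9/python-311

WORKDIR /app

# Install dependencies first so this layer is cached across rebuilds.
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the application code itself.
COPY . .

# The same image can now be deployed to any OCI-compliant runtime.
CMD ["python", "app.py"]
```

Because the image bundles the application with its dependencies, the decision about *where* to run it can be revisited later without repackaging, which is exactly the strategic flexibility described above.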
HOW RED HAT DELIVERS OPEN SOURCE VALUE
At Red Hat, our focus is on making open source technologies consumable and supportable by enterprise IT. Red Hat’s business model is 100% open source—no bait-and-switch, and no open core holding back valuable bits as proprietary add-on software.
We collaborate through upstream projects because doing so is at the heart of the economic and business model that makes open source such an effective way to develop software. Working upstream lets Red Hat engage closely with the open source community and influence technology choices in ways that are important to our customers, our partners, and us. It helps ensure that we use the strengths of open source development and maintain the technology expertise to provide fast and knowledgeable product support, while also working with the community to encourage innovation.
Red Hat has a well-established process for turning open source projects into enterprise subscription products that satisfy the demands of some of the most challenging and critical applications in markets such as financial services, government, and telecommunications. Red Hat is also focused on creating value through a portfolio of products and an ecosystem of partners.
To meet the challenges brought by the digitization of the business, IT needs to simultaneously close three serious gaps. It needs to build a comprehensive cloud-native infrastructure to close the gap between what the business requires and what traditional IT can deliver. It needs to deliver applications, services, and access to infrastructure that are in line with what both customers and employees have come to expect from consumer devices and public cloud services. And it needs to do this iteratively and quickly, while maintaining and connecting back to the classic IT on which core business services are running.
Individual organizations will achieve these various goals in a variety of ways. But the vast majority will do so in a hybrid manner. They will modernize and optimize existing assets to retain and extend their value. They will build and deploy new cloud-native infrastructures to provide the best platform for quickly and iteratively delivering needed business services for internal and external customers. They will use resources from a variety of public clouds.
But making the most effective use of these disparate types of technology means that taking an open approach to cloud is not a nice-to-have for IT organizations. It is a must-have.