Linux containers are often positioned as disruptive to traditional virtualization, a framing that frequently culminates in the question: will containers kill virtualization? It’s a fair question, given the similarities in workload isolation, resource utilization, and so on, but the answer is a hard "no." The two are complementary, each solving a distinct challenge for the enterprise. That said, they historically haven’t integrated or worked well together, which has meant separate application stacks, separate developer workflows, and so on.
That was then. Now, Red Hat is helping to bring containers and virtualization together with a new project that we’re demoing today at Red Hat Summit, aimed at bringing container-native virtualization technology to Red Hat Cloud Suite.
Based on the open source KubeVirt community project, container-native virtualization enables developers to work with virtual machines (VMs) in the same way they work with Linux container-based applications. Developers can create and add virtualized applications to their projects from the OpenShift Service Catalog, just as they would with containerized apps; the resulting application can then run side by side with cloud-native workloads on Red Hat OpenShift.
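In KubeVirt terms, a VM becomes just another Kubernetes resource that can be defined declaratively and managed with the same tooling as pods. As an illustrative sketch only (the resource names, API version, and fields below reflect the evolving upstream project and may differ from any shipped release), a minimal VirtualMachine definition might look like:

```yaml
# Hypothetical minimal KubeVirt VirtualMachine manifest.
# Field names follow the upstream project at the time of writing
# and may change as the API evolves.
apiVersion: kubevirt.io/v1alpha2
kind: VirtualMachine
metadata:
  name: demo-vm
spec:
  running: true            # start the VM as soon as it is created
  template:
    spec:
      domain:
        resources:
          requests:
            memory: 64Mi   # guest memory request
        devices:
          disks:
            - name: rootdisk
              disk:
                bus: virtio
      volumes:
        - name: rootdisk
          containerDisk:   # VM disk image delivered as a container image
            image: kubevirt/cirros-container-disk-demo
```

Applied with standard Kubernetes tooling (e.g. `kubectl apply -f vm.yaml`), the VM is then scheduled and managed alongside containerized workloads on the same cluster.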
Why is this such a big deal? As digital transformation shapes enterprise IT strategies, the ability to quickly develop, deploy, and maintain applications becomes critical. The siloed nature of VMs and containers, from the underlying platforms to the developer and operations workflows, can stifle these efforts with undue complexity, high costs, and limited integration. Container-native virtualization modernizes application development by making it easier for developers to pull traditional VM-based workloads into the same workflow as containerized applications, helping them deliver new, composite applications while cutting complexity.
This also factors into Red Hat’s broader vision for the future of IT: a cloud-native enterprise built on service-defined infrastructure. Think of it as a further abstraction of enterprise technology, an abstraction that started with Linux decoupling the operating system from being hardware dependent. Service-defined infrastructure aims to apply this same capability to the products and projects that power the enterprise today, providing infrastructure services such as networking, storage, tenancy, and identity as a service, to application platforms and container platforms alike regardless of IT footprint. From that solid foundation, everything, from virtualization and storage to containers and even cloud resources, can be consumed as a service through an API.
From there, service-defined infrastructure can automate common management tasks, like resource and service provisioning, to better enable self-service, auto-scaling, and operations. Existing infrastructure platforms like OpenStack can evolve to provide these capabilities to container platforms and virtualization infrastructure, bridging the rich ecosystem of supported resources and solutions from many vendors to be accessible in cloud-native applications in a way that retains their hybrid cloud flexibility and portability.
With service-defined infrastructure, enterprises would no longer worry about what footprint their datacenter is based on or what cloud provider they’re using. IT teams would simply provision and consume services similar to how they do applications today, with nearly every enterprise IT need available "as-a-service." This, obviously, is some ways away, but container-native virtualization (and container-native storage from Red Hat before it) is a key component towards achieving this vision, straddling the line between service and infrastructure and enabling traditional workloads to be treated in much the same way as more modern, cloud-native applications.
Container-native virtualization is still quickly evolving via the upstream KubeVirt project but is ready for early adopters to try. We plan to make it available initially as Technology Preview in a future version of Red Hat OpenShift Container Platform as part of Red Hat Cloud Suite. Stay tuned to learn more about container-native virtualization and how Red Hat aims to make enterprise IT infrastructure an enabler of digital transformation, rather than a barrier.
About the authors
Lars Herrmann is always found at the forefront of technology. From the early days of Linux to today’s digital transformation built on hybrid cloud, containers and microservices, Lars has consistently helped enterprises leverage open source technologies to drive business results. At Red Hat, Lars leads Red Hat Partner Connect, Red Hat's technology partner program.