A few weeks ago, we shared our thoughts about running containers in cars. Today, let's discuss our vision for the experience of developing containerized in-vehicle applications.
In our previous blog post, we discussed using Podman and systemd to run containers in cars. You might be asking yourself:
- Does this mean that developers won’t be able to leverage their experience working with Kubernetes or Red Hat OpenShift?
- Will this prevent the automotive industry from attracting new talent because developers need to learn yet another new process for container management and deployment?
- Do we need to create new CI/CD (continuous integration/continuous delivery) pipelines rather than relying on existing infrastructure and tools, and won’t this slow down the development process?
In a word—no. Let us explain further.
If you work with containers, you are most likely familiar with Kubernetes objects, often expressed in YAML format (Kubernetes YAML). Using Kubernetes YAML to describe how to deploy containerized applications has become a de facto standard. We want to leverage this standard to enable a model in which local application development, virtual testing, hardware-in-the-loop testing, and final deployment all share common ground: Kubernetes YAML describing how an application should be deployed.
Podman has the ability to run containers using Kubernetes YAML files as input. Our recent blog post, How to "build once, run anywhere" at the edge with containers, describes how to deploy containers using either Podman or Kubernetes (and thus OpenShift) with the same Kubernetes YAML. Following the demo in that blog post, you will retrieve a set of Kubernetes YAML files (describing how to deploy an application) that you may use to deploy that application on a Kubernetes or OpenShift cluster using kubectl apply (or oc apply). Then, you will use the same files to deploy and run that application locally using podman kube play.
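To make this concrete, here is a minimal sketch of the kind of Kubernetes YAML involved. The application name, labels, image, and port below are hypothetical placeholders, not taken from the demo:

```yaml
# Hypothetical example: a minimal Pod definition that both
# Kubernetes/OpenShift and Podman can consume unchanged.
apiVersion: v1
kind: Pod
metadata:
  name: ivi-radio            # hypothetical in-vehicle application
  labels:
    app: ivi-radio
spec:
  containers:
    - name: radio-service
      image: quay.io/example/ivi-radio:latest   # hypothetical image
      ports:
        - containerPort: 8080
```

The same file then works on a cluster with kubectl apply -f pod.yaml (or oc apply -f pod.yaml) and on a developer's machine with podman kube play pod.yaml.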
Using this mechanism, we can foresee an architecture in which developers build and test their applications locally using Podman. Once satisfied, they can then push their source files, Containerfile, and Kubernetes YAML to one or more repositories, all of which would then be used in a continuous integration pipeline to compile the code into a container image. That container image can then be deployed according to the instructions in the Kubernetes YAML and tested as desired. If the tests pass, the application can then move on to the next testing phase, be that another virtual testing phase or onto actual hardware. All of these files (sources, Containerfile and Kubernetes YAML) are sufficient to enable running (and thus testing) the application in all contexts.
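As a sketch of the third artifact in that set, a Containerfile for such an application might look like the following. The base image, file paths, and binary name are hypothetical, and a real build would likely use a multi-stage build to compile the code first:

```dockerfile
# Hypothetical Containerfile: package a prebuilt application binary
# on a minimal base image.
FROM registry.access.redhat.com/ubi9/ubi-minimal

# Copy the compiled application produced by the CI build step.
COPY ./build/radio-service /usr/local/bin/radio-service

# Port the service listens on (matches the Kubernetes YAML).
EXPOSE 8080

ENTRYPOINT ["/usr/local/bin/radio-service"]
```

In the pipeline described above, a CI job would run podman build against this file to produce the container image, push it to a registry, and then deploy it for testing according to the Kubernetes YAML.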
In summary, we wanted to point you to our previous blog post to highlight what it means for the automotive industry, which is to say that Kubernetes knowledge can be transferred from the IT industry to the automotive sector. This will help reduce the amount of net-new knowledge engineers need to learn when joining the automotive industry and accelerate the development process of containerized applications by leveraging the existing container ecosystem.
About the authors
Pierre-Yves Chibon (aka pingou) is a Principal Software Engineer who spent nearly 15 years in the Fedora community and is now looking at the challenges the automotive industry offers to the FOSS ecosystems.
Ygal Blum is a Principal Software Engineer who is also an experienced manager and tech lead. He writes code from C and Java to Python and Golang, targeting platforms from microcontrollers to multicore servers, and servicing verticals from testing equipment through mobile and automotive to cloud infrastructure.
Alexander has worked at Red Hat since 2001, doing development work on the desktop and containers, including creating Flatpak and a lot of foundational work on GNOME.