"I don't have permission to do this!" That's one of many phrases that I used to hear from a couple of developers when trying to introduce them to a Configuration Management tool. This barrier was, and still is in some cases, a major obstacle in companies that didn't adhere to the DevOps movement and practices, a problem that I like to call as "The Wall". If you are questioning, yes, I am a huge Pink Floyd fan!
"The Wall" is characterized when, for some reason(s), developers, and operations don't find a middle ground and share the responsibilities. What it will probably result in is a rigid and unreliable application. In a world, where requirements are constantly modified due to customer behavior, these teams will likely fail, when trying to achieve a stable, resilient, and flexible platform. It's not easy to deal with this cultural problem, and one of the main pillars that support this is technical complexity. All sides have their own specifics, problems, architecture, languages… If we just throw packages from one side to the other, we are just adding "another brick in the wall". There is a necessity to translate this entanglement and meet halfway, working with platforms like Kubernetes, a container orchestrator that can provide a smooth approach when introducing teams to areas that they didn't know before.
The great advantage of Kubernetes is that it does not treat infrastructure "as code" but "as data." In 2013, Ansible's creator, Michael DeHaan, stated that "Infrastructure is best modeled not as code, nor in a GUI, but as a text-based, middle-ground, data-driven policy," the idea being that with basic text representations, people don't need to get close to high-complexity languages. On Kubernetes, as with Ansible, you can easily use a human-readable key-value format called YAML to describe what you need. It's important to notice that you are not saying "how"; you just say "Hey, I want this," and Kubernetes will figure it out. Obviously, you need to provide the right information and follow the guidelines, and you can still set technical specifics when you need to and know how, but users who are just starting out find this characteristic an open door into the container world. Although Kubernetes also works with JSON and protobuf, the orchestrator provides an easy way to interact with the Kubernetes API, creating, modifying, and deleting resources.
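To make that concrete, here is a minimal sketch of a Deployment manifest. The name, labels, and image below are illustrative assumptions rather than anything from a specific project; the point is that you declare the desired state ("I want three copies of this container") and leave the "how" to the cluster.

apiVersion: apps/v1
kind: Deployment
metadata:
  name: hello-app              # illustrative name
spec:
  replicas: 3                  # "I want three copies"; Kubernetes works out how
  selector:
    matchLabels:
      app: hello-app
  template:
    metadata:
      labels:
        app: hello-app
    spec:
      containers:
      - name: hello-app
        image: nginx:1.25      # illustrative image and tag
        ports:
        - containerPort: 80

Applied with a command such as kubectl apply -f <file>, Kubernetes compares this declared state with what is actually running in the cluster and reconciles the difference, with no imperative steps spelled out by the user.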
It's a declarative system that is highly interchangeable and extensible, two of its main advantages. Relying on CI/CD practices, you can implement validation using admission controllers, create templates, extend the platform through Operators, and use other tools, like Helm, as part of the deployment pipeline. Maintainers can control roles and access to functions, establishing policies that help keep the platform stable and auditable.
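As one example of that access control, here is a hedged sketch of a namespaced RBAC Role and RoleBinding; the namespace, role name, and group are hypothetical placeholders. It grants a developer group read-only access to Deployments, one simple way to share responsibility without removing every safeguard.

apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  namespace: dev-team                 # hypothetical namespace
  name: deployment-viewer             # hypothetical role name
rules:
- apiGroups: ["apps"]
  resources: ["deployments"]
  verbs: ["get", "list", "watch"]     # read-only verbs
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: deployment-viewer-binding
  namespace: dev-team
subjects:
- kind: Group
  name: developers                    # hypothetical group of developers
  apiGroup: rbac.authorization.k8s.io
roleRef:
  kind: Role
  name: deployment-viewer
  apiGroup: rbac.authorization.k8s.io

Because policies like this are themselves plain YAML, they can be reviewed, versioned, and audited in the same pipeline as the application manifests.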
Kubernetes embraces DevOps culture in its functionality, with features that make it a tool ready to break down unnecessary organizational silos. This powerful bridge between operations and development provides a collaborative way to learn, build, rebuild, extend, and do a lot more, paving the way for container-based apps, cloud environments, and cloud-native development.
About the author
Gabriel Sampaio La Greca de Paiva is an Architect for Red Hat.