Machine learning is essential in modern software systems. It extends traditional software ecosystems with new services, including recommendations, text or speech processing, and image processing. It also enables new capabilities, such as anomaly detection and automatic healing of software components.
Machine learning is very attractive to many industries given the large amounts of data produced by different software systems.
[ What is edge machine learning? ]
An enterprise architect knowledgeable about machine learning can design a software ecosystem that effectively uses machine learning services. As domain experts, architects know best what kind of data is produced and available in the software ecosystem. They are in an extremely important position to decide whether using machine learning makes sense, what problems it may solve, and how to use it to extend the capabilities of their software ecosystem.
3 ways architects can use machine learning
Machine learning is a type of artificial intelligence that enables software to learn about data without explicitly being programmed to do so. Machine learning models are algorithms that imitate the way people learn. Therefore, they can improve their performance over time after "seeing" more data. Using machine learning in enterprise architectures is something worth considering for multiple reasons, including these three.
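As a rough illustration of "learning from data" rather than from hand-written rules, the sketch below (assuming scikit-learn is available; the toy metrics and labels are invented for illustration) trains a small classifier and applies it to unseen inputs:

```python
# Minimal sketch: a model "learns" a decision rule from labeled examples
# instead of having the rule programmed explicitly.
from sklearn.tree import DecisionTreeClassifier

# Toy training data: [requests_per_sec, error_rate] -> 1 = anomalous, 0 = normal
X = [[120, 0.01], [150, 0.02], [900, 0.35], [980, 0.40], [130, 0.01], [950, 0.38]]
y = [0, 0, 1, 1, 0, 1]

model = DecisionTreeClassifier().fit(X, y)

# The learned rule generalizes to data points it has not seen before.
print(model.predict([[140, 0.02], [970, 0.33]]))  # -> [0 1]
```

The point is not the specific algorithm: the behavior comes from the data the model was trained on, and it can improve as more data becomes available.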
[ Check out Red Hat's Portfolio Architecture Center for a wide variety of reference architectures you can use. ]
Creating specialized applications
First, machine learning can provide functions or services inside an application ecosystem to accelerate business processes and use cases. These applications are domain-specific and have been used with great success in areas including finance (for example, fraud detection), healthcare (such as disease prediction, medical imaging diagnostics, and new drug development), and telecommunications (examples include network anomaly detection and intelligent network self-healing).
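As a hedged sketch of the fraud-detection idea, the snippet below uses scikit-learn's IsolationForest on simulated transaction amounts; the feature choice, amounts, and contamination rate are illustrative assumptions, not a production design:

```python
# Sketch: unsupervised screening of transactions for anomalies.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Simulated transactions: mostly small amounts plus a few large outliers.
normal = rng.normal(loc=50, scale=15, size=(500, 1))
fraud = rng.normal(loc=900, scale=50, size=(5, 1))
X = np.vstack([normal, fraud])

detector = IsolationForest(contamination=0.01, random_state=0).fit(X)
labels = detector.predict(X)           # -1 = flagged as anomalous, 1 = normal
print("flagged transactions:", int((labels == -1).sum()))
```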
[ Learn how to accelerate machine learning operations (MLOps) with Red Hat OpenShift. ]
Accelerating software development
Second, machine learning may accelerate parts of the software development lifecycle, such as load testing and system maintenance. This benefit also comes with better service-level agreement (SLA) fulfillment.
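As a rough illustration (not a prescribed method), load-test results could feed a simple regression that estimates latency at higher concurrency against an SLA target; the column names and numbers below are invented:

```python
# Sketch: fit a simple model on load-test measurements and extrapolate
# to a higher concurrency level as a capacity-planning aid.
import numpy as np
from sklearn.linear_model import LinearRegression

concurrent_users = np.array([[10], [50], [100], [200], [400]])
p95_latency_ms = np.array([120, 180, 260, 450, 900])

model = LinearRegression().fit(concurrent_users, p95_latency_ms)

# Rough extrapolation: does 600 users still fit a hypothetical 1500 ms SLA?
predicted = model.predict([[600]])[0]
print(f"predicted p95 latency at 600 users: {predicted:.0f} ms")
```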
Processing data
The third reason is that software applications and processes generate vast amounts of data. An enterprise ecosystem can employ machine learning to put that data to use by:
- Improving operations using AIOps
- Adding new service offerings for business users with intelligent data extraction based on natural language processing (NLP) techniques
- Expanding service offerings for consumers using recommender systems (a minimal sketch follows this list)
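To illustrate the last point, here is a tiny item-based recommender sketch built only with NumPy; the interaction matrix and the cosine-similarity scoring are illustrative assumptions rather than a recommended architecture:

```python
# Sketch: item-based recommendations from a user-item interaction matrix.
import numpy as np

# Rows = users, columns = items; 1 means the user consumed the item.
interactions = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
])

# Cosine similarity between item columns.
norms = np.linalg.norm(interactions, axis=0)
similarity = (interactions.T @ interactions) / np.outer(norms, norms)

# Recommend for user 0: score unseen items by similarity to items they used.
user = interactions[0]
scores = similarity @ user
scores[user == 1] = -np.inf          # do not re-recommend items already seen
print("recommended item index:", int(np.argmax(scores)))
```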
Wrap up
Solution architects have the keys and knowledge to unleash the power of data through machine learning for the software solutions they design, both during the software development lifecycle and in operations. Depending on the domain, specialized (sometimes external) knowledge may be required to include machine learning in a software ecosystem. Open source software is key to unlocking the potential of AI and machine learning in software ecosystems.
If you'd like to learn more about machine learning in enterprise architecture, look for my next article, where I present a demonstration of using machine learning to detect pneumonia, and a future article about a machine learning model for system capacity planning. I hope these articles inspire you to learn more about machine learning and what it can do for your organization.
[ Try OpenShift Data Science in our Developer sandbox or in your own cluster. ]
About the author
Arthur is a senior data scientist specialist solution architect at Red Hat Canada. With the help of open source software, he is helping organizations develop intelligent application ecosystems and bring them into production using MLOps best practices.
He has over 15 years of experience in the design, development, integration, and testing of large-scale service enablement applications.
Arthur is pursuing his PhD in computer science at Concordia University, and he is a research assistant in the Software Performance Analysis and Reliability (SPEAR) Lab. His research interests are related to AIOps, with a focus on performance and scalability optimization.