The following is an excerpt from my Red Hat Summit keynote today.


Just over a decade ago, Marc Andreessen pointed out that software was eating the world. We can definitively update his quote to be more accurate: "Software ate the world." Software has taken over our businesses and how we create value for our customers.

I think about it this way: We are a software factory, and we’re helping you become a software factory. With these capabilities, you can build the future you want. You can choose where to run your applications based on your business needs, with the flexibility of hybrid cloud (grounded in Linux and open source). You can find repeatability in software production pipelines, avoiding hand-crafted mistakes, with a common platform like OpenShift. You can tame the complexity of distributed systems with automation using Ansible. And you can de-risk security in software supply chains, from development to production, with tools like ACS.

Software is your business, and no one could be in the software business without developers. The ability to move rapidly from experiment to production is the hallmark of a high-velocity development team. That ability is even more important when working with the edge.

Some of our data-overload challenges feel similar to what we’ve experienced in software development, but today we’re working in a new space – with lots of complexity. In other words, while software ate the world, artificial intelligence is now eating software.

This is happening because so much software is now interacting with the world. Just as businesses differentiate with software, they’re looking for insights from data. Businesses want to be more data-driven and are leveraging data and AI to get there – this is how we can enable our people to make smarter decisions, as I mentioned at the outset. Not only can we make better use of data to make more informed decisions, but we can also deliver better customer experiences by embedding intelligence in the products and services our customers use. 

Guess what? Red Hat had the same challenges, and as we worked to manage our own needs -- or scratch our itch -- we discovered that we were not alone. True to our roots, we brought our work into the open, creating a community project. And there, we could share what we’ve learned about data science and machine learning with customers and partners. That project is Open Data Hub, a blueprint for building an AI-as-a-service platform. 

It serves as the foundation for the data science and AI platform we launched last year: Red Hat OpenShift Data Science. And as we’ve learned, AI is not a one-and-done endeavor. You have your software development pipeline. Do you have an AI development pipeline? 

Think about it: Your source code is analogous to data, and your deployed applications are analogous to deployed machine learning models. The discipline of going from source code to testing to production of software at scale is well understood. But with AI, are you applying that same discipline from development to deployment? And that needs to happen at scale.  
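That pipeline discipline can be made concrete. The sketch below is purely illustrative, not any Red Hat tooling: it treats model promotion like a build that must pass tests, gating deployment on a held-out evaluation metric. All function names and the toy "model" are assumptions for the example.

```python
# Minimal sketch of an ML "pipeline gate": just as code must pass tests
# before release, a model must clear an evaluation threshold before
# deployment. Everything here is illustrative, not a product API.

def train(data):
    """'Train' a trivial model: always predict the mean of the labels."""
    labels = [y for _, y in data]
    mean = sum(labels) / len(labels)
    return lambda x: mean

def evaluate(model, data):
    """Mean absolute error on held-out data."""
    return sum(abs(model(x) - y) for x, y in data) / len(data)

def gate_deployment(model, holdout, max_error=1.0):
    """Promote the model only if it clears the quality bar."""
    error = evaluate(model, holdout)
    return ("deploy", error) if error <= max_error else ("reject", error)

train_set = [(1, 2.0), (2, 2.5), (3, 3.0)]
holdout = [(4, 2.4), (5, 2.6)]

model = train(train_set)
decision, error = gate_deployment(model, holdout)
print(decision, round(error, 2))
```

The point is the shape, not the model: the same promote-or-reject gate you apply to application builds applies to every model version headed for production.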

If an average enterprise is made up of a few thousand applications, it will soon be made up of thousands of machine learning models as well. As decisions rely more and more on AI/ML, I don’t know about you, but I want to establish trust in the model making those decisions so I can have confidence in acting on them.

Part of building that trust is through: 

  • collaboration – helping to build the model

  • transparency – understanding what went into the model

  • auditability – seeing what changes were made to models and the impacts those changes had on the outcomes.
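One way to picture auditability is an append-only record of every model change: what data went in, who made the change, and what it did to the outcomes. The sketch below is a hypothetical illustration, not any particular product's schema; the field names and hash-chaining approach are assumptions for the example.

```python
# Illustrative sketch of model auditability: an append-only log recording
# what went into each model version (transparency), who changed it
# (collaboration), and how outcomes moved (auditability). Hash-chaining
# entries makes tampering with history detectable. Hypothetical schema.
import hashlib
import json

audit_log = []

def record_model_change(version, training_data_ids, metrics, author):
    entry = {
        "version": version,
        "training_data": sorted(training_data_ids),  # what went in
        "metrics": metrics,                          # impact on outcomes
        "author": author,                            # who made the change
        "prev_hash": audit_log[-1]["hash"] if audit_log else None,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    audit_log.append(entry)
    return entry

record_model_change("v1", ["batch-001"], {"accuracy": 0.91}, "alice")
record_model_change("v2", ["batch-001", "batch-002"], {"accuracy": 0.94}, "bob")
print(len(audit_log), audit_log[1]["prev_hash"] == audit_log[0]["hash"])
```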

For CIOs, the peace of mind that came from datacenters and IT assets remaining safely ensconced within the four walls of headquarters is long gone. The advent of the cloud, blazing-fast processors, improvements in wireless networks, and the spread of far-flung but crucial remote operations have come together to make sure of that. But the technical freedoms we take advantage of today aren’t without challenges. This is where we believe edge computing will be transformative.

Edge computing is the ability to generate insights from data and act on them locally, where it matters. Intelligent devices are pushing the boundaries of where computing can happen – on earth, in space, and wherever else there’s a benefit to an enterprise or, perhaps, humanity itself.

Edge computing can now take place at or near the physical location of either the user or the source of the data — whether that’s an SUV speeding down the highway, sensors monitoring a natural-gas pipeline in the middle of nowhere, or on-board a satellite orbiting the earth.

That is hybrid, and that is the future.

With Red Hat, your workloads can span the typical IT footprints – from datacenters, to clouds, to the edge. We deliver innovation from open source communities, providing consistency you can count on and the flexibility to choose where and how you securely build and deploy your applications and ML models.

Let’s talk about security. There’s plenty of risk out there: The Apache log4j vulnerabilities demonstrated that enterprises have to be aware of what open source they have deployed and how actively they are managing it.

Open source is in almost 99% of audited codebases. Come to think of it, we can say that "open source software ate the world." But that ubiquity makes you a target: In 2021, there was a 650% year-over-year increase in software supply chain attacks aimed at exploiting weaknesses in upstream open source ecosystems.

Without a doubt, this is top of mind for companies, like Red Hat, that produce software. And, of course, for you, as you continue to build your business and differentiation through software. It’s at the top of government agendas around the world, especially with the continued rise of ransomware and protestware. We must ensure that the integrity of software updates is protected and verified across the entire development lifecycle.

In other words, the key for enterprise use of open source is to make sure you’re aware of what you’re using, where it’s being used, and how it’s being used. Of course, Red Hat takes care of the provenance and safety of open source code we deliver in our products. We’re also building and delivering tools for things you do on your own.
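Awareness starts with an inventory. The toy checker below is a hedged sketch of the idea, not a real scanner: real workflows would consume a proper SBOM and a live vulnerability feed, and the component names and advisory data here are hypothetical.

```python
# Hedged sketch: knowing what open source you run starts with an
# inventory of deployed components, checked against advisory data.
# A real workflow would use an SBOM and a vulnerability feed; the data
# and version logic here are deliberately simplified.

deployed = {"log4j-core": "2.14.1", "openssl": "3.0.7", "requests": "2.31.0"}

# Hypothetical advisory data: component -> first fixed version.
advisories = {"log4j-core": "2.17.1"}

def parse_version(v):
    """Split '2.14.1' into (2, 14, 1) for simple comparison."""
    return tuple(int(part) for part in v.split("."))

def flagged_components(inventory, advisories):
    """Return components running a version older than the first fixed one."""
    return [
        name for name, version in inventory.items()
        if name in advisories
        and parse_version(version) < parse_version(advisories[name])
    ]

print(flagged_components(deployed, advisories))  # flags log4j-core
```

However simple, this captures the three questions that matter: what you’re using, where it’s running, and whether it’s being actively managed.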

I will leave you with this. What is the future that you want to build?

Because Red Hat believes that open unlocks the world’s potential, we’d love to help you start building your future today.


About the author

Chris Wright is senior vice president and chief technology officer (CTO) at Red Hat. Wright leads the Office of the CTO, which is responsible for incubating emerging technologies and developing forward-looking perspectives on innovations such as artificial intelligence, cloud computing, distributed storage, software defined networking and network functions virtualization, containers, automation and continuous delivery, and distributed ledger.
