Matt Hicks has been a Red Hatter since 2006. In that time he’s seen the market for enterprise software evolve from one obsessed with complex closed-source solutions to one focused on services, on-demand software development tools, open source and global scale. Sitting atop that current pile of innovative technologies are things like AI and quantum computing.

Today, Hicks is Red Hat’s president and CEO, and in that role he’s piloting the ship through the narrow straits of technological promise. Which waters does he intend to navigate? Which technological rivers will be left for others to explore, and which will Red Hat chart for the industry as a whole? In this video interview with Stu Miniman, Hicks discusses these and other topics around Red Hat and its technology portfolio.

When it comes to AI, quantum computing and other new technologies, Hicks says, “You can think of this almost as how we work with an Intel on it. That depth in research… you know we don't fabricate chips, but Intel does. We don't build foundation (AI) models, but IBM does. Our ability to work at a level of depth that we don't have to tune our software to [those things]... it's really been refreshing. It's been cool. I've been multiple times now to [IBM Research in] Yorktown as we both look at the AI area. But then that's starting to lead to new research areas like Quantum [computing], which is only a step removed from this space. Quantum will impact AI, which will impact these same platforms. We've worked with IBM research for 20 years at this point. We have done the most in the last year with those teams.”

So while Red Hat focuses on the underlying infrastructure to enable these innovative technologies, Hicks is also focused on the developers who will have to consume these services and projects, integrating them into their applications. Traditionally, this has been performed through cloud APIs.

“I'm not a big fan of [systems like] cloud APIs [where] I call it, and magic happens. I've seen that a lot in the last 20 years, about different technologies that people can't learn. I think a lot of developers don't like a magic layer; they really like the visibility. They learn from it. I've spent probably the last two months on things like transformer models, but other variances as well,” said Hicks.

He added that the questions he wants to solve for developers are much more focused on workflows than APIs. “Can I make a vision classifier work on my laptop with no Internet? Can I make an audio classifier work? How many large language models can I get to work? Can I fine-tune them? Can I give it some of my own audio data and train it to be able to identify a new sound, a new picture?” asked Hicks.
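The workflow Hicks describes, training a model offline on your own labeled data and immediately using it, can be illustrated without any cloud API at all. The sketch below is purely hypothetical (it is not Red Hat or IBM tooling, and the class and feature names are invented for illustration): a tiny nearest-neighbour classifier where “fine-tuning” is just adding one labeled example of your own data, entirely on a laptop with no Internet.

```python
import math

class TinyClassifier:
    """Toy 1-nearest-neighbour classifier; 'training' is storing
    labeled feature vectors, so everything runs offline."""

    def __init__(self):
        self.examples = []  # list of (feature_vector, label) pairs

    def add_example(self, features, label):
        # "Fine-tune" by adding one labeled example of our own data.
        self.examples.append((features, label))

    def predict(self, features):
        # Return the label of the closest stored example.
        _, label = min(
            (math.dist(f, features), lbl) for f, lbl in self.examples
        )
        return label

# Train on two made-up "sound" feature vectors...
clf = TinyClassifier()
clf.add_example([0.9, 0.1], "doorbell")
clf.add_example([0.1, 0.9], "siren")

# ...then teach it a brand-new sound with a single example.
clf.add_example([0.5, 0.5], "dog bark")
print(clf.predict([0.52, 0.48]))  # prints "dog bark"
```

Real local experimentation would of course swap this toy for an actual vision or audio model, but the shape of the loop, add your own data, retrain, predict, with full visibility and no “magic layer”, is the same.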

These and other crucial questions about the future of enterprise cloud are answered and ruminated upon in this week’s episode of In the Clouds.

About the author

Red Hatter since 2018, tech historian, founder of, serial non-profiteer.
