A year ago, I took the stage at Red Hat Summit and spoke about our industry having an AI moment. It had been just a few months since the world had begun to see the potential of generative AI technologies, like ChatGPT.
Since then, speculation about the possibilities of generative AI has been a rapidly increasing drumbeat in executive boardrooms, engineering communities, and even at the dinner table with family and friends. Can it really increase my workforce productivity 10x? Is a trillion parameters a lot? Is it OK to let the model create my charts for statistics class?
My own experimentation with generative AI has consumed quite a few late nights. I’ve downloaded models onto my laptop. I’ve used them to write some code, draft some emails, and summarize some reports. I am energized by what you can accomplish when you combine your own curiosity with the world’s information and an AI assistant that brings “expertise” to all sorts of situations. Let’s call that excitement level 8/10.
But looking back on those early experiments, I now realize that something was missing. I’d ask myself questions such as:
- “Why can’t I make this AI more personalized?”
- “Where could this AI be used to help my teams more?”
- “What if communities could improve this like we do with open source software?”
- “Will AI follow the patterns of open source, or are we going back to proprietary?”
As a business leader, I’m expected to get the best from our teams. The challenge is that we’re a dynamic organization, with associates spread around the world. We have different areas of expertise that are needed in different places at the same time. How do we harness the skills and knowledge of our people to improve as many parts of the business as possible? So I challenged Red Hatters with this note:
Somewhere in our company, there is a subject-matter expert with skills and knowledge about some unique part of our business. And somewhere else in the company, there is a manager who wishes they had tens or thousands of clones of that subject-matter expert on their team. In a perfect world, there would be a simple way to transfer the skills and knowledge of one individual to the broader team or teams. This would allow the business to grow more rapidly. It would reduce the complexity of onboarding new hires. It would foster new ideas because the baseline of expertise would be available for anyone to build upon. And it might even allow that subject-matter expert to enjoy their upcoming vacation a little more, knowing that their phone won’t ring while they are relaxing on the beach.
Our leadership realized that if this was possible, not only would it improve Red Hat, but it would have the potential to significantly improve all of our customers' businesses. And if we could make it collaborative, it would also allow our partners to bring their expertise.
So as I prepare to address the Red Hat Summit this week, I’m excited to announce that we’ve made that potential a reality. In collaboration with IBM, we’re taking the next step in bringing open source to AI. Over the last two weeks, we’ve open sourced two important elements that we believe will bring a truly open experience to the world of generative AI and large language models (LLMs).
First, Red Hat and IBM have open sourced the Granite language and code-assistant LLMs. This builds on the efforts that Meta and Mistral have taken to bring open models to market. We’re taking it a step further by opening not just the model license, but also the weights and the data sources.
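For those who want to try it, an open model means you can pull the weights down and experiment with familiar open source tooling. Here is a minimal sketch, assuming a released Granite model is available on Hugging Face under the ibm-granite organization (the model ID below is an assumed example, not a guaranteed name):

```python
# Minimal sketch: loading an open Granite code model with Hugging Face
# transformers. The model ID is an assumed example; check the ibm-granite
# organization on Hugging Face for the models that were actually released.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-3b-code-instruct"  # assumed example ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Ask the code-assistant model a simple coding question.
prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```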
Second, we’ve open sourced InstructLab, which uses a novel synthetic data-based alignment tuning method for LLMs. InstructLab allows anyone, not just data scientists, to be the subject-matter expert who can help train and tune a model.
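To make the idea concrete, here is a conceptual sketch of synthetic data-based tuning, not InstructLab’s actual pipeline: a subject-matter expert writes a handful of seed examples, a “teacher” model expands them into a much larger synthetic dataset, and that dataset is then used to tune the model. The generate_pairs and fine_tune helpers below are hypothetical placeholders.

```python
# Conceptual sketch only (not InstructLab's actual implementation): expert-written
# seed examples are expanded by a teacher model into synthetic training data,
# which is then used to tune a student model.

SEED_EXAMPLES = [
    {
        "question": "What must be complete before a product release is approved?",
        "answer": "Security sign-off, updated documentation, and a tested rollback plan.",
    },
    # ...a handful more expert-written Q&A pairs...
]


def expand_seed(seed, teacher, n_variants=20):
    """Ask the teacher model for new Q&A pairs in the style of one seed example."""
    prompt = (
        "Here is an example question and answer from a subject-matter expert:\n"
        f"Q: {seed['question']}\nA: {seed['answer']}\n"
        f"Generate {n_variants} new, varied question/answer pairs on the same topic."
    )
    return teacher.generate_pairs(prompt)  # hypothetical helper on a served model


def build_synthetic_dataset(seeds, teacher):
    """Grow a small set of expert seeds into a larger synthetic training set."""
    dataset = []
    for seed in seeds:
        dataset.extend(expand_seed(seed, teacher))
    return dataset


# Usage (hypothetical objects):
# synthetic = build_synthetic_dataset(SEED_EXAMPLES, teacher_model)
# tuned = fine_tune(student_model, synthetic)  # hypothetical training step
```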
Combined, these two announcements enable communities of people to teach LLMs to learn in the same way that humans learn.
By bringing a set of open models to the community, we can all collaborate in the open to build models that deliver value to individuals, teams, and businesses. By pairing those models with open tools that allow anyone to contribute to a model and tune it in new ways, we can truly unlock the potential of subject-matter expertise anywhere.
We believe that these two announcements will deliver the next steps in bringing the richness of open communities, open contributions, and open ideas to AI. By reducing the barriers to actively participating in creating AI that works for YOUR work, YOUR teams, and YOUR business, we think the next generation of AI for business is right around the corner.
So on behalf of Red Hat, we look forward to collaborating with all of you to bring the foundations of open source to AI. Let’s see what’s possible!
About the author
Matt Hicks was named President and Chief Executive Officer of Red Hat in July 2022. In his previous role, he was Executive Vice President of Products and Technologies where he was responsible for product engineering for much of the company’s portfolio, including Red Hat® OpenShift® and Red Hat Enterprise Linux®. He is one of the founding members of the OpenShift team and has been at the forefront of cloud computing ever since.
Prior to joining Red Hat 16 years ago, Hicks served in various roles spanning computer engineering, IT, and consulting. He has worked with Linux and open source for more than 25 years, and his breadth of experience has helped him solve customer and business problems across all areas of IT.