Artificial intelligence (AI) is shaping the future of industries across the globe. Yet the intricate mathematics and complex theories often associated with AI research can pose significant barriers to its broader adoption. That’s where Red Hat’s newest podcast, “No Math AI”, comes in.

Hosted by Dr. Akash Srivastava, Red Hat chief architect and a pioneer of the InstructLab project, and Isha Puri, a Massachusetts Institute of Technology PhD student in AI, “No Math AI” is a monthly podcast designed to make cutting-edge AI research more accessible. Whether you’re an AI practitioner, a business leader or a tech enthusiast, this podcast offers insights into the real-world impact of AI advancements on business.

Each episode will break down crucial AI concepts and distill them into actionable takeaways. “No Math AI” makes it easier for enthusiasts and business leaders to understand and embrace AI and incorporate it into their strategy with confidence. 

Episode 1: Inference-time scaling and how small models beat the big ones

In the debut episode of “No Math AI”, Dr. Srivastava and Isha are joined by guest speakers and research engineers Shivchander Sudalairaj, GX Xu and Kai Xu. Together, they dive into a topic that’s making waves in AI performance: inference-time scaling.

Our hosts and guest speakers discuss how this technique is unlocking new levels of performance for AI models, enhancing reasoning capabilities, powering agentic AI and helping ensure higher accuracy.

Tune in to the episode below to discover how inference-time scaling is transforming AI performance in real-world scenarios and how businesses can use it to stay ahead in the rapidly evolving AI landscape.
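To give a rough feel for the idea before you listen: one common flavor of inference-time scaling is best-of-n sampling, where the model generates several candidate answers and a scorer picks the best one, so spending more compute at inference time improves accuracy without retraining. The toy sketch below is purely illustrative and is not from the episode; the `sample_answer` and `verifier_score` functions are hypothetical stand-ins for a stochastic model and a reward model.

```python
import random

# Toy illustration of inference-time scaling via best-of-n sampling.
# sample_answer is a hypothetical stand-in for one stochastic model
# generation; verifier_score is a stand-in reward model.

def sample_answer(rng: random.Random) -> int:
    # Noisy guesses around the true answer (42): sometimes right, sometimes off.
    return 42 + rng.choice([-2, -1, 0, 0, 1, 2])

def verifier_score(answer: int) -> int:
    # Higher score for candidates closer to the true answer.
    return -abs(answer - 42)

def best_of_n(n: int, seed: int = 0) -> int:
    # Draw n candidates at inference time and keep the best-scoring one.
    rng = random.Random(seed)
    candidates = [sample_answer(rng) for _ in range(n)]
    return max(candidates, key=verifier_score)

# A single draw may be wrong; drawing 16 candidates and keeping the
# best one is far more likely to land on the correct answer.
print(best_of_n(1))
print(best_of_n(16))
```

The point of the sketch: no model weights change, yet accuracy improves simply because more candidates are considered per query, which is the trade-off the episode explores.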

You can listen and watch the first episode on Spotify or the Red Hat YouTube channel. Don't forget to join our hosts each month for more insights into how AI can help shape the future of your business.

Resource

An introduction to artificial intelligence for the enterprise: A beginner’s guide

Accelerate your AI adoption journey with Red Hat OpenShift AI and Red Hat Enterprise Linux AI. Learn more in this beginner’s guide.

About the author

Carlos Condado is a Senior Product Marketing Manager for Red Hat AI. He helps organizations navigate the path from AI experimentation to enterprise-scale deployment by guiding the adoption of MLOps practices and integration of AI models into existing hybrid cloud infrastructures. As part of the Red Hat AI team, he works across engineering, product, and go-to-market functions to help shape strategy, messaging, and customer enablement around Red Hat’s open, flexible, and consistent AI portfolio.

With a diverse background spanning data analytics, integration, cybersecurity, and AI, Carlos brings a cross-functional perspective to emerging technologies. He is passionate about technological innovations and helping enterprises unlock the value of their data and gain a competitive advantage through scalable, production-ready AI solutions.
