In a move that underscores the growing importance of open-source AI in the enterprise, Hugging Face and IBM have announced a collaboration on watsonx.ai, a next-generation studio designed to help AI builders train, validate, tune, and deploy models—from traditional machine learning to cutting-edge generative AI.
Today's enterprises face a challenge: no single model fits all needs. Companies want to choose the best model for each use case while maximizing relevance on their own data and keeping compute costs under control. Privacy and intellectual property remain top concerns, driving the need for control over model training and deployment.
As AI spreads across departments, organizations expect to run hundreds or even thousands of models simultaneously. Given the rapid pace of AI innovation, models will need to be replaced more frequently, reinforcing the need for a streamlined pipeline from training to production.
Standardization and automation are key. Fortunately, recent developments have paved the way:
- Model standardization: The Transformer architecture has become the de facto standard for deep learning applications such as NLP, computer vision, and audio processing.
- Pre-trained models: Hundreds of thousands of models are now available on platforms like Hugging Face, allowing users to quickly test and shortlist promising candidates.
- Open-source libraries: Hugging Face libraries let users download models with a single line of code and start experimenting in minutes, with consistent tools that work across laptops and production environments.
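The "single line of code" claim can be illustrated with a minimal sketch using the Transformers `pipeline` API (the model name below is just one of the many checkpoints hosted on the Hub):

```python
from transformers import pipeline

# One line downloads a pretrained model and its tokenizer from the Hugging Face Hub
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# The same object works unchanged on a laptop or a production server
result = classifier("Open-source AI keeps getting better!")
print(result)
```

The returned list contains a label and a confidence score; swapping in a different Hub model is a one-argument change.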
Cloud partnerships have further simplified scaling. Following collaborations with AWS (Amazon SageMaker) and Microsoft (Azure Machine Learning), Hugging Face is now working with IBM on watsonx.ai. Built on Red Hat OpenShift, watsonx.ai will be available both in the cloud and on-premises—a boon for customers with strict compliance requirements who previously had to build custom ML platforms.
Under the hood, watsonx.ai integrates Hugging Face's open-source libraries, including Transformers, Accelerate, PEFT, and Text Generation Inference. This native integration lets Hugging Face customers work seamlessly with their models and datasets on the IBM platform.
Additionally, IBM plans to release its own collection of large language models as open source, making them available on the Hugging Face Hub.
The collaboration was announced during IBM THINK 2023, where Dr. Darío Gil (SVP and Director of IBM Research) and Hugging Face CEO Clem Delangue demonstrated the platform and discussed the partnership.
"We're thrilled to work with IBM on watsonx.ai," said a Hugging Face spokesperson. "One of the most iconic technology companies joining forces with an up-and-coming startup to tackle AI in the enterprise—who would have thought?"