mmBERT: Bringing ModernBERT's Efficiency to Multilingual NLP
AI · April 26, 2026 · 4:10 PM

Researchers have released mmBERT, a multilingual encoder built on the efficient ModernBERT architecture. The model extends ModernBERT's design to more than 100 languages, promising faster training and inference for cross-lingual tasks. Early benchmarks show mmBERT achieving competitive performance on tasks such as text classification and named entity recognition across diverse languages, at lower computational cost than traditional multilingual BERT models. The model is available under an open-source license, with the stated aim of democratizing access to high-performance multilingual NLP.