DailyGlimpse

Fine-Tuning BERT with Contrastive Learning for Semantic Search

AI
April 27, 2026 · 3:31 PM

In the latest lecture of the Generative AI series, Constantine Caramanis explores how to fine-tune models like BERT on custom data to improve performance on semantic search tasks. The session covers contrastive learning, InfoNCE (Information Noise-Contrastive Estimation), and how to construct effective training datasets. The lecture also introduces bi-encoders and explains how they are trained and used.

Key concepts include:

  • BERT embeddings
  • InfoNCE
  • Contrastive loss
  • Similarity matrix
  • Cross-entropy
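The concepts above fit together in one short computation: embed queries and documents, build a similarity matrix, and apply cross-entropy with in-batch negatives. Here is a minimal numpy sketch of that InfoNCE loss. The random vectors stand in for BERT embeddings (no model is loaded), and the function name and temperature value are illustrative choices, not from the lecture.

```python
import numpy as np

def info_nce_loss(q, d, temperature=0.05):
    """InfoNCE with in-batch negatives.

    q, d: (batch, dim) embeddings of queries and their matching documents.
    Row i of q pairs with row i of d (the positive); every other row of d
    serves as a negative. Returns the mean cross-entropy over the batch.
    """
    # L2-normalize so dot products become cosine similarities
    q = q / np.linalg.norm(q, axis=1, keepdims=True)
    d = d / np.linalg.norm(d, axis=1, keepdims=True)
    # Similarity matrix: entry (i, j) = sim(query_i, doc_j) / temperature
    sim = q @ d.T / temperature
    # Cross-entropy with the diagonal (matched pairs) as the target class
    sim = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_probs = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
queries = rng.normal(size=(8, 16))
random_docs = rng.normal(size=(8, 16))
aligned_docs = queries + 0.01 * rng.normal(size=(8, 16))  # near-matches

# Training pushes the loss toward the aligned case: matched pairs sit on
# the diagonal of the similarity matrix and dominate their rows.
print(info_nce_loss(queries, aligned_docs))
print(info_nce_loss(queries, random_docs))
```

Because the positives lie on the diagonal, minimizing this loss pulls each query toward its own document and pushes it away from the rest of the batch, which is exactly what a bi-encoder needs for semantic search.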

This is part of an ongoing course on Generative AI, which now includes 23 lessons.