BERT-JEPA: Reorganizing CLS Embeddings for Language-Invariant Semantics

📝 Original Info

  • Title: BERT-JEPA: Reorganizing CLS Embeddings for Language-Invariant Semantics
  • ArXiv ID: 2601.00366
  • Date: 2026-01-01
  • Authors: Taj Gillin, Adam Lalani, Kenneth Zhang, Marcel Mateos Salles

📝 Abstract

Joint Embedding Predictive Architectures (JEPAs) are a class of self-supervised training techniques that have recently shown promise across domains. We introduce BERT-JEPA (BEPA), a training paradigm that adds a JEPA objective to BERT-style models, combating the collapse of the [CLS] embedding space and reorganizing it into a language-agnostic space. This new structure leads to improved performance across multilingual benchmarks.
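
Since the full text is omitted below, the following is only a minimal sketch of the general JEPA recipe applied to a BERT [CLS] embedding, not the paper's actual method. Everything here is an assumption for illustration: the choice of cross-lingual sentence pairs as context/target views, the small MLP predictor, the EMA (exponential-moving-average) target encoder, and all names (`jepa_loss`, `ema_update`, `context_encoder`, `target_encoder`). The core JEPA pattern it shows is predicting a stop-gradient target embedding from a context embedding, which discourages a collapsed representation space.

```python
# Hypothetical sketch of a JEPA-style objective on BERT [CLS] embeddings.
# NOT the paper's implementation; a generic illustration of the pattern.
import copy
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
context_encoder = AutoModel.from_pretrained("bert-base-multilingual-cased")

# EMA target encoder: a frozen copy that slowly tracks the context encoder.
target_encoder = copy.deepcopy(context_encoder)
for p in target_encoder.parameters():
    p.requires_grad = False

# Small predictor head mapping context [CLS] -> predicted target [CLS].
hidden = context_encoder.config.hidden_size
predictor = torch.nn.Sequential(
    torch.nn.Linear(hidden, hidden),
    torch.nn.GELU(),
    torch.nn.Linear(hidden, hidden),
)

def jepa_loss(src_texts, tgt_texts):
    """Predict the target [CLS] embedding (assumed here to come from a
    translation of the source sentence) from the context [CLS] embedding."""
    src = tokenizer(src_texts, return_tensors="pt", padding=True, truncation=True)
    tgt = tokenizer(tgt_texts, return_tensors="pt", padding=True, truncation=True)
    ctx_cls = context_encoder(**src).last_hidden_state[:, 0]  # [CLS] token
    with torch.no_grad():  # stop-gradient target, key to avoiding collapse
        tgt_cls = target_encoder(**tgt).last_hidden_state[:, 0]
    return F.smooth_l1_loss(predictor(ctx_cls), tgt_cls)

@torch.no_grad()
def ema_update(momentum=0.996):
    """After each optimizer step, move the target encoder toward the
    context encoder by an exponential moving average."""
    for p_t, p_c in zip(target_encoder.parameters(), context_encoder.parameters()):
        p_t.mul_(momentum).add_(p_c, alpha=1.0 - momentum)
```

In this sketch the JEPA loss would be added to (or alternated with) the usual BERT objectives during training, with `ema_update()` called after each step; whether BEPA uses translation pairs, a different view construction, or a different loss is not recoverable from the abstract alone.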

📄 Full Content

...(The full content is omitted here due to its length. Please see the full text on the site.)
