BGE-Multilingual-Gemma2

Embeddings

BGE-Multilingual-Gemma2 is an LLM-based multilingual embedding model trained on a diverse range of languages and tasks. It offers two main advancements, and a usage sketch follows below.

Diverse training data: the training data spans a broad range of languages, including English, Chinese, Japanese, Korean, and French, and covers a variety of task types, such as retrieval, classification, and clustering.

Outstanding performance: the model achieves state-of-the-art (SOTA) results on multilingual benchmarks such as MIRACL, MTEB-pl, and MTEB-fr, and also performs strongly on other major evaluations, including MTEB, C-MTEB, and AIR-Bench.
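As a minimal sketch of how such an embedding model is typically used for multilingual retrieval, the snippet below encodes a query and candidate documents with the sentence-transformers library and scores them by similarity. The model ID BAAI/bge-multilingual-gemma2 and the <instruct>/<query> prompt format follow the public Hugging Face model card; treat them as assumptions and verify against the card before use.

```python
# Sketch: multilingual retrieval with BGE-Multilingual-Gemma2 via
# sentence-transformers. Model ID and query prompt format are assumptions
# taken from the public Hugging Face card (BAAI/bge-multilingual-gemma2).
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("BAAI/bge-multilingual-gemma2")

# Queries carry a task instruction prefix; documents are encoded as-is.
instruction = "Given a web search query, retrieve relevant passages that answer the query."
queries = ["Quelle est la capitale de la France ?"]  # French query
documents = [
    "Paris is the capital and most populous city of France.",
    "Berlin is the capital of Germany.",
]

query_embeddings = model.encode(queries, prompt=f"<instruct>{instruction}\n<query>")
document_embeddings = model.encode(documents)

# Similarity scores between each query and each document; higher = more relevant.
scores = model.similarity(query_embeddings, document_embeddings)
print(scores)
```

Because the model is multilingual, the query and documents need not share a language: the French query above should still rank the Paris passage highest.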

About the BGE-Multilingual-Gemma2 model

Published on huggingface

29/06/2024


Input price

0.01 / Mtoken (input)


Context size
Unknown
Parameters
9.24B
