| Model | Dim | Retrieval | STS | PairClassification | Classification | Reranking | Clustering | Avg |
|---|---|---|---|---|---|---|---|---|
| M3E (base) | 768 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 | 57.79 |
| M3E (large) | 1024 | 54.75 | 50.42 | 64.30 | 68.20 | 59.66 | 48.88 | 57.66 |
| Multilingual E5 (base) | 768 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 | 56.21 |
| Multilingual E5 (large) | 1024 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 | 58.84 |
| OpenAI-Ada-002 | 1536 | 52.00 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 | 53.02 |

...
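Scores like these come from running a model through the MTEB evaluation harness (the column layout above matches the C-MTEB leaderboard). A minimal sketch using the older `mteb` API, assuming the `mteb` package is installed and the Chinese task definitions from the FlagEmbedding repo are available; `T2Retrieval` stands in for the full task list:

```python
from mteb import MTEB
from sentence_transformers import SentenceTransformer
# import C_MTEB  # assumption: registers the Chinese tasks (ships with the FlagEmbedding repo)

# Load the embedding model under test; any sentence-transformers model works.
model = SentenceTransformer("moka-ai/m3e-base")

# Evaluate a single retrieval task; the full benchmark runs many tasks
# per category (Retrieval, STS, PairClassification, ...), and the table's
# per-category numbers are averages over those tasks.
evaluation = MTEB(tasks=["T2Retrieval"])
evaluation.run(model, output_folder="results/m3e-base")
```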
M3E (Moka Massive Mixed Embedding): a new Chinese embedding model trained on a dataset of over 22 million Chinese sentence pairs. It supports heterogeneous text and outperforms openai-ada-002 on text classification and text retrieval tasks.

BGE (BAAI General Embedding) notes:
Related model: bge-large-zh
Download: https://huggingface.co/BAAI/bge-large-zh
Resources ...
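Both model families load through sentence-transformers. A minimal sketch (model IDs as published on Hugging Face; the query instruction prefix follows the BGE README's recommendation for Chinese retrieval):

```python
from sentence_transformers import SentenceTransformer

m3e = SentenceTransformer("moka-ai/m3e-base")
bge = SentenceTransformer("BAAI/bge-large-zh")

docs = ["嵌入模型将文本映射为向量。", "Embedding models map text to vectors."]

# M3E: encode text directly, no instruction prefix needed.
doc_vecs = m3e.encode(docs, normalize_embeddings=True)

# BGE: for retrieval, prepend the instruction to queries only;
# passages are encoded as-is.
query = "为这个句子生成表示以用于检索相关文章:" + "什么是嵌入模型?"
query_vec = bge.encode(query, normalize_embeddings=True)
```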
#"text2vec-bge-large-chinese": "shibing624/text2vec-bge-large-chinese", #"m3e-small": "moka-ai/m3e-small", "m3e-base": "D:\Langchain-Chatchat\models\moka-ai\m3e-base", #"m3e-large": "moka-ai/m3e-large", #"bge-small-zh": "BAAI/bge-small-zh", #"bge-base-zh": "B...
| Backbone | Embedding model | Score 1 | Score 2 | Score 3 | Notes |
|---|---|---|---|---|---|
| chinese-roberta-wwm-ext | m3e-base | 42.647 | 78.676 | 88.235 | base model (figure) |
| | m3e-large | 36.029 | 61.765 | 72.059 | base model |
| | bge-small-zh | 36.765 | 65.441 | 74.265 | base model |
| | bge-base-zh | 38.235 | 64.706 | 78.676 | base model |
| | bge-large-zh | 38.971 | 68.382 | 82.353 | base model |
| chinese-macbert-base | text2vec-base-chinese | 40.44... | | | |
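The three score columns are unlabeled in the source; if they are top-k retrieval hit rates at increasing k (a common setup for this kind of comparison, and consistent with the scores rising left to right), they can be reproduced along these lines. `queries`, `docs`, and `gold` are hypothetical stand-ins for the evaluation set:

```python
import numpy as np
from sentence_transformers import SentenceTransformer

def hit_rate_at_k(model_name: str, queries: list[str], docs: list[str],
                  gold: list[int], k: int = 3) -> float:
    """Fraction of queries whose gold document index appears among the
    top-k documents by cosine similarity."""
    model = SentenceTransformer(model_name)
    q = model.encode(queries, normalize_embeddings=True)
    d = model.encode(docs, normalize_embeddings=True)
    sims = q @ d.T                           # cosine similarity (vectors are unit-norm)
    topk = np.argsort(-sims, axis=1)[:, :k]  # indices of the k best docs per query
    hits = sum(gold[i] in topk[i] for i in range(len(queries)))
    return hits / len(queries)

# Hypothetical usage:
# hit_rate_at_k("moka-ai/m3e-base", queries, docs, gold, k=1)
```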