Repo URL: https://github.com/qhduan/bert-model (the RoBERTa and WWM models come from ymcui)
Taking the Flat-Lattice-Enhanced SikuBERT Pre-trained Model as an Example*

Xie Jing, Liu Jiangfeng, Wang Dongbo

Abstract: Annotating the named entities in ancient traditional Chinese medicine (TCM) literature makes it possible to mine the TCM knowledge they contain and to advance the modernization of TCM. Building on the BERT-base, RoBERTa, SikuBERT, and SikuRoBERTa pre-trained models, the paper takes the Huangdi Neijing · Suwen as its research object and the Flat-Lattice Transformer (FLAT) structure as the fine-tuning model, constructing...
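As a rough illustration of the setup the abstract describes, the sketch below loads a SikuBERT-style checkpoint for named-entity recognition via Hugging Face transformers. It is not the paper's exact pipeline: the FLAT fine-tuning layer is replaced here by the plain linear token-classification head that transformers provides, the Hub model id "SIKU-BERT/sikubert" is assumed, and the entity tag set is hypothetical.

```python
# Minimal NER sketch, assuming the "SIKU-BERT/sikubert" checkpoint on the
# Hugging Face Hub; the paper's FLAT layer is swapped for the standard
# token-classification head.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

labels = ["O", "B-TCM", "I-TCM"]  # hypothetical tag set for illustration

tokenizer = AutoTokenizer.from_pretrained("SIKU-BERT/sikubert")
model = AutoModelForTokenClassification.from_pretrained(
    "SIKU-BERT/sikubert", num_labels=len(labels)
)

text = "黄帝内经素问"  # classical Chinese is tokenized character by character
enc = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**enc).logits  # shape: (1, seq_len, num_labels)

# Predictions also cover the [CLS]/[SEP] special tokens added by the tokenizer.
pred_ids = logits.argmax(-1)[0].tolist()
print([labels[i] for i in pred_ids])
```

With an untuned head these predictions are random; in practice the model would first be fine-tuned on the annotated Suwen corpus before inference.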
{ "version": "2.3.3", "resolved": "https://registry.npmmirror.com/fsevents/-/fsevents-2.3.3.tgz", "integrity": "sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw==", "hasInstallScript": true, "optional": true, "os": [ "...