Lexicon: Simple-Lexicon: Simplify the Usage of Lexicon in Chinese NER
FLAT: FLAT: Chinese NER Using Flat-Lattice Transformer
MRC: A Unified MRC Framework for Named Entity Recognition

Text Summarization (TS, Text-Summary)
BertSum: Fine-tune BERT for Extractive Summarization

References
keras/tensorflow version compatibility: docs.floydhub.com/guide
BERT-NER-Pytorch: github.com/lonePatient/
bert4keras: github...
This library is inspired by...
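The MRC approach listed above recasts NER as extractive question answering: each entity type is paired with a natural-language query, and the model extracts answer spans from the sentence. A minimal sketch of the data-construction step, assuming illustrative entity types and query wordings (not the paper's exact templates):

```python
# Sketch: turn one sentence into (query, context) pairs for MRC-style NER.
# Entity types and query templates below are illustrative placeholders.
QUERIES = {
    "PER": "Find person names in the text.",
    "LOC": "Find location names in the text.",
    "ORG": "Find organization names in the text.",
}

def build_mrc_examples(sentence, gold_spans):
    """gold_spans: {entity_type: [(start, end), ...]}, end exclusive.

    Returns one example per entity type; types absent from gold_spans
    become negative examples with an empty span list."""
    examples = []
    for etype, query in QUERIES.items():
        examples.append({
            "query": query,
            "context": sentence,
            "spans": gold_spans.get(etype, []),
        })
    return examples

examples = build_mrc_examples(
    "Li Hua works at Tsinghua University in Beijing.",
    {"PER": [(0, 6)], "ORG": [(16, 35)], "LOC": [(39, 46)]},
)
```

One practical benefit of this framing is that entity types unseen at training time can be targeted simply by writing a new query.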
# required
pretrained_model_name_or_path = "bert-base-chinese"
# paths to the training and validation corpora; the training path alone is enough
path_corpus = os.path.join(path_root, "corpus", "sequence_labeling", "ner_china_people_daily_1998_conll")
path_train = os.path.join(path_corpus, "train.conll")
path_dev = os.path.join...
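The train.conll/dev.conll files referenced above presumably follow the usual CoNLL convention: one token and its label per line, with blank lines separating sentences. A minimal reader sketch under that assumption (the dataset's exact column layout may differ):

```python
def read_conll(path):
    """Read a CoNLL-style file: one 'token label' pair per line,
    blank lines separating sentences.

    Returns a list of (tokens, labels) tuples, one per sentence."""
    sentences, tokens, labels = [], [], []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:  # blank line ends the current sentence
                if tokens:
                    sentences.append((tokens, labels))
                    tokens, labels = [], []
                continue
            token, label = line.split()[:2]
            tokens.append(token)
            labels.append(label)
    if tokens:  # final sentence without a trailing blank line
        sentences.append((tokens, labels))
    return sentences
```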
Supports pretrained models including BERT, ERNIE, ROBERTA, NEZHA, ALBERT, XLNET, ELECTRA, GPT-2, TinyBERT, XLM, and T5;
Supports loss functions including BCE-Loss, Focal-Loss, Circle-Loss, Prior-Loss, Dice-Loss, and LabelSmoothing;
Lightweight dependencies, concise code, detailed comments, clear debugging, flexible configuration, and easy extension, well suited to NLP tasks.
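Of the losses listed, Focal-Loss is the standard remedy for class imbalance: it down-weights well-classified examples by a factor (1 - p_t)^gamma so training focuses on hard cases. A pure-Python sketch of the binary form (the alpha/gamma defaults are common choices, not necessarily this library's):

```python
import math

def binary_focal_loss(p, y, alpha=0.25, gamma=2.0, eps=1e-9):
    """Binary focal loss for a single prediction.

    p: predicted probability of the positive class; y: true label (0 or 1).
    With gamma == 0 this reduces to alpha-weighted binary cross-entropy."""
    p_t = p if y == 1 else 1.0 - p          # probability of the true class
    alpha_t = alpha if y == 1 else 1.0 - alpha
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t + eps)

# An easy, confident prediction contributes far less than a hard one:
easy = binary_focal_loss(0.95, 1)
hard = binary_focal_loss(0.30, 1)
```

In practice the per-example values are averaged over a batch, and gamma trades off how aggressively easy examples are suppressed.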