Satyam Prasad Tiwari · Posted 2 years ago
Have you tried updating transformers with pip install --upgrade transformers?

rafaDDD (Topic Author) · Posted 2 years ago
Yeah, but it failed anyway. Then I tried using "BertTokenizer" instead of "AutoTokenizer"; it still failed, but the tracebacks are different...
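For reference, a minimal sketch of the two loading paths discussed in the thread; the checkpoint name "bert-base-uncased" is an assumption here, not the original poster's actual model:

```python
from transformers import AutoTokenizer, BertTokenizer

# AutoTokenizer inspects the checkpoint's config and resolves the
# matching tokenizer class automatically.
auto_tok = AutoTokenizer.from_pretrained("bert-base-uncased")

# BertTokenizer loads the BERT WordPiece tokenizer directly, skipping
# the auto-resolution step -- which is why a failure here can produce
# a different traceback than the AutoTokenizer path.
bert_tok = BertTokenizer.from_pretrained("bert-base-uncased")

print(auto_tok("Hello, world!")["input_ids"])
```

Since the two classes fail at different points (config resolution vs. vocabulary loading), comparing their tracebacks can help narrow down whether the problem is the installed transformers version or the checkpoint files themselves.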
(NLP) tasks, including the GLUE benchmark and the SQuAD question-answering dataset. This model is based on the BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding paper and the Hugging Face implementation, leveraging mixed-precision arithmetic and Tensor Cores on V100 GPUs for ...
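As a rough illustration of the mixed-precision inference described above, here is a minimal sketch assuming PyTorch with a CUDA-capable GPU; the checkpoint name is a placeholder, not NVIDIA's exact fine-tuned model:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "bert-base-uncased"  # placeholder checkpoint, assumed for illustration
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name).cuda().eval()

inputs = tokenizer("Mixed precision example.", return_tensors="pt").to("cuda")

# autocast runs eligible ops in float16, which is what lets the GPU's
# Tensor Cores accelerate the matrix multiplies inside BERT.
with torch.no_grad(), torch.autocast("cuda", dtype=torch.float16):
    logits = model(**inputs).logits

print(logits)
```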
Disclaimer: The team releasing BERT did not write a model card for this model, so this model card has been written by the Hugging Face team.

Model description
BERT is a transformers model pretrained on a large corpus of multilingual data in a self-supervised fashion. This means it was pretrai...
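A minimal sketch of using such a pretrained model for its self-supervised objective, masked-token prediction; the checkpoint "bert-base-multilingual-cased" is assumed here from the card's mention of multilingual data and may not be the exact model described:

```python
from transformers import pipeline

# fill-mask exercises the masked-language-modeling head the model was
# pretrained with: it ranks candidate tokens for the [MASK] position.
fill_mask = pipeline("fill-mask", model="bert-base-multilingual-cased")

for pred in fill_mask("Paris is the [MASK] of France."):
    print(pred["token_str"], round(pred["score"], 3))
```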