Open the configuration_chatglm.py file (usually located under the transformers_modules/4v/ directory) and check whether the attribute you are trying to access is defined there. If it is not, confirm that the attribute name you are using is correct, or whether the attribute has been removed or renamed in a newer version of the module. 5. Fix the error in your code based on the error message and the module documentation: if the attribute name is wrong, change it to the correct name; if the attribute has, in a newer ver...
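As a quick check before editing anything, the loaded config object can be probed directly. A minimal sketch, assuming the model files sit in a local directory such as ./chatglm-4v; the directory name and the probed attribute name are placeholders:

```python
from transformers import AutoConfig

# Hypothetical local directory containing config.json and configuration_chatglm.py.
config = AutoConfig.from_pretrained("./chatglm-4v", trust_remote_code=True)

attr = "pre_seq_len"  # placeholder: the attribute the failing code tries to read
if hasattr(config, attr):
    print(attr, "=", getattr(config, attr))
else:
    print(attr, "is not defined on", type(config).__name__)
    print("Available attributes:", sorted(vars(config)))
```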
ModelScope version: from configuration_chatglm import ChatGLMConfig. The Hugging Face version has an extra dot (from .configuration_chatglm import ChatGLMConfig), which makes it load configuration_chatglm.py from the same folder; that is the correct behavior. So if you downloaded the model from ModelScope and ran into this error, just change that line of code in modeling_chatglm.py. I've ...
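A minimal sketch of that one-line change in modeling_chatglm.py, assuming the file came from ModelScope and is being loaded through transformers with trust_remote_code=True:

```python
# Before (ModelScope copy): absolute import, which fails when transformers loads the
# file as part of its dynamically created transformers_modules package.
# from configuration_chatglm import ChatGLMConfig

# After (Hugging Face style): relative import, resolved from the same folder.
from .configuration_chatglm import ChatGLMConfig
```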
No such file or directory: '/root/.cache/huggingface/modules/transformers_modules/chatglm-6b/configuration_chatglm.py' (#1022, opened by hexiaojin1314 on May 15, 2023, 12 comments). Traceback (most recent call last): ...
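One common cause is that the local model directory is missing the custom code files, so nothing gets copied into the ~/.cache/huggingface/modules cache. A small sketch to check for them before loading; the directory path is a placeholder:

```python
import os

model_dir = "./chatglm-6b"  # placeholder: the local directory the model was downloaded to
required = ["configuration_chatglm.py", "modeling_chatglm.py", "tokenization_chatglm.py"]
missing = [f for f in required if not os.path.exists(os.path.join(model_dir, f))]
print("missing custom-code files:", missing or "none")

# If the files are present but the error persists, deleting the stale cache directory
# ~/.cache/huggingface/modules/transformers_modules and loading again may help.
```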
ValueError: Unrecognized configuration class <class 'transformers_modules.local.configuration_chatglm.ChatGLMConfig'> to build an AutoTokenizer. (#37, opened by wizd on Mar 15, 2023, 13 comments, closed) ...
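This error typically means AutoTokenizer fell back to its built-in tokenizer registry instead of the custom code shipped with the model. A minimal sketch of the usual remedy; the local path is a placeholder:

```python
from transformers import AutoModel, AutoTokenizer

model_dir = "./chatglm-6b"  # placeholder: local model directory or the hub model id

# trust_remote_code=True lets transformers resolve the tokenizer and model classes
# declared in the model's auto_map instead of using only its built-in registry.
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True)
```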
"ChatGLMModel" ], "attention_dropout": 0.0, "attention_softmax_in_fp32": true, "auto_map": { "AutoConfig": "configuration_chatglm.ChatGLMConfig", "AutoModel": "modeling_chatglm.ChatGLMForConditionalGeneration", "AutoModelForSeq2SeqLM": "modeling_chatglm.ChatGLMForConditionalGeneration" }...
ValueError: Unrecognized configuration class <class 'transformers_modules.local.configuration_chatglm.ChatGLMConfig'> to build an AutoTokenizer. (#53, opened by aliendaniel on Mar 15, 2023, 2 comments, closed) ...
Current Behavior
OSError: Can't load the configuration of './output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-3000'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure './output/adge...
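For P-Tuning checkpoints this often happens because the checkpoint directory only contains the trained prefix-encoder weights rather than a full model with config.json. A sketch of the loading pattern used for such checkpoints, assuming the base model is THUDM/chatglm-6b and pre_seq_len was 128 during training; the paths are placeholders:

```python
import os
import torch
from transformers import AutoConfig, AutoModel

base_model = "THUDM/chatglm-6b"  # or a local copy of the base model
ckpt_dir = "./output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-3000"

# Load the base model with the same pre_seq_len that was used for P-Tuning.
config = AutoConfig.from_pretrained(base_model, trust_remote_code=True, pre_seq_len=128)
model = AutoModel.from_pretrained(base_model, config=config, trust_remote_code=True)

# Copy only the prefix-encoder weights out of the fine-tuned checkpoint.
prefix_state_dict = torch.load(os.path.join(ckpt_dir, "pytorch_model.bin"))
new_prefix_state_dict = {
    k[len("transformer.prefix_encoder."):]: v
    for k, v in prefix_state_dict.items()
    if k.startswith("transformer.prefix_encoder.")
}
model.transformer.prefix_encoder.load_state_dict(new_prefix_state_dict)
```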