max_position_embeddings #8

jasonzou opened this issue on Aug 21, 2024 · 1 comment

jasonzou commented on Aug 21, 2024:
Thanks a lot, I've learned a great deal! One question about your model's config: https://github.com/AI-Study-Han/Zero-Chatgpt/blob/d19e74bc3d2f15c743c084fb6949232a17b040d0/pretrain/model/config.json#...
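For reference, the field being asked about can be checked straight from the checkpoint's config.json. A minimal sketch, assuming a local copy of the file (the path below is a placeholder, not taken from the repo):

```python
import json

# Placeholder path to a local copy of the checkpoint's config.json.
with open("pretrain/model/config.json") as f:
    config = json.load(f)

# max_position_embeddings sizes the positional encoding (e.g. the RoPE cache),
# so it should be at least as long as the longest sequence used in pretraining.
print("max_position_embeddings:", config.get("max_position_embeddings"))
```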
* Baichuan2-13B does not have max_position_embeddings in its config; see https://huggingface.co/baichuan-inc/Baichuan2-13B-Chat/blob/main/config.json

  Signed-off-by: Wang, Yi A <yi.a.wang@intel.com>

* Update server/text_generation_server/models/flash_causal_lm.py

  Co-authored-by: Danië...
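The kind of fallback such a patch needs can be sketched as follows; this is not the actual text-generation-inference code, and the alternative keys and default value are assumptions (Baichuan2's config exposes a model_max_length field instead of max_position_embeddings):

```python
from transformers import AutoConfig

def get_context_length(model_id: str, default: int = 2048) -> int:
    """Best-effort context-length lookup for models whose config lacks
    max_position_embeddings. Fallback keys and default are illustrative."""
    config = AutoConfig.from_pretrained(model_id, trust_remote_code=True)
    for key in ("max_position_embeddings", "model_max_length", "seq_length"):
        value = getattr(config, key, None)
        if isinstance(value, int) and value > 0:
            return value
    return default

# Example: Baichuan2-13B-Chat, whose config has no max_position_embeddings.
# print(get_context_length("baichuan-inc/Baichuan2-13B-Chat"))
```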
... in Qwen2Attention.__init__
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1709, in __getattr__
    raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
AttributeError: 'Qwen2Attention' object has no attribute 'max_position_embeddings'
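The traceback comes from torch.nn.Module.__getattr__, which raises whenever an attribute was never assigned on the module (and is not a parameter, buffer, or submodule). A minimal sketch of the failure mode and the usual fix, using a simplified stand-in class rather than the actual Qwen2 code:

```python
import torch.nn as nn

class ToyAttention(nn.Module):
    def __init__(self, config: dict, store_max_pos: bool = True):
        super().__init__()
        self.hidden_size = config["hidden_size"]
        if store_max_pos:
            # The fix: cache the value on the module in __init__ so later code
            # (e.g. rotary-embedding setup) can read self.max_position_embeddings.
            self.max_position_embeddings = config["max_position_embeddings"]

cfg = {"hidden_size": 1024, "max_position_embeddings": 32768}

broken = ToyAttention(cfg, store_max_pos=False)
try:
    broken.max_position_embeddings  # never assigned, so nn.Module.__getattr__ raises
except AttributeError as err:
    print(err)  # 'ToyAttention' object has no attribute 'max_position_embeddings'

print(ToyAttention(cfg).max_position_embeddings)  # 32768
```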
Previously, max_position_embeddings was missing from the config and thus defaulted to 8192, causing generation issues once the current context window exceeds 8192. This PR hotfixes the issue. cc @patrickvonplaten @simon-mo

Co-authored-by: Woosuk Kwon <woosuk.kwon@berkeley.edu>
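To see why the wrong default only manifests past 8192 tokens, consider a position table sized by that default; the shapes below are assumptions for illustration, not the actual vLLM implementation:

```python
import torch

assumed_default = 8192                         # window used when the config key is missing
rope_cache = torch.randn(assumed_default, 64)  # e.g. a precomputed cos/sin cache

position = 8192  # first position past the wrongly defaulted window
try:
    _ = rope_cache[position]
except IndexError as err:
    print("generation breaks once the context exceeds 8192:", err)
```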