inputs = tokenizer.apply_chat_template(
    [{"role": "user", "image": image, "content": query}],
    add_generation_prompt=True,
    tokenize=True,
    return_tensors="pt",
    return_dict=True,
)  # chat mode
inputs = inputs.to(device)
model = AutoModelForCausalLM.from_pretrained(
    "/mnt/disk1/models/...
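The snippet above is cut off mid-call. For context, a minimal end-to-end sketch of the same flow is shown below; the model path `THUDM/glm-4v-9b`, dtype, and generation settings are assumptions for illustration, not taken from this PR.

```python
def build_messages(query, image):
    # GLM-4V chat format: the image is attached to the user turn.
    return [{"role": "user", "image": image, "content": query}]

def run_glm4v_chat(model_path, image, query, device="cuda"):
    # Hypothetical end-to-end helper; requires torch, transformers,
    # and the glm-4v weights available locally or on the Hub.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
    inputs = tokenizer.apply_chat_template(
        build_messages(query, image),
        add_generation_prompt=True,
        tokenize=True,
        return_tensors="pt",
        return_dict=True,
    ).to(device)
    model = AutoModelForCausalLM.from_pretrained(
        model_path, torch_dtype=torch.bfloat16, trust_remote_code=True
    ).to(device).eval()
    with torch.no_grad():
        out = model.generate(**inputs, max_new_tokens=256)
    # Strip the prompt tokens before decoding the reply.
    reply = out[:, inputs["input_ids"].shape[1]:]
    return tokenizer.decode(reply[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Example usage (needs a GPU and the downloaded weights):
    # from PIL import Image
    # print(run_glm4v_chat("THUDM/glm-4v-9b",
    #                      Image.open("cat.png"), "Describe this image."))
    pass
```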
Support glm4v multimodal model #335 (Closed)
update 1ad8694
li-plus merged commit 0f7a8a9 into main on Jul 25, 2024. 13 checks passed.
li-plus deleted the glm4v branch on July 25, 2024 at 04:32.