Feature request: when training LoRA with PEFT, if the model contains learnable embedding or parameter variables, PEFT won't add additional trainable parameters for them. Maybe add trainable-parameter support for nn.Embedding() and nn.Parameter()...
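Until such a feature exists, PEFT's modules_to_save option already covers the nn.Embedding case (it keeps a full trainable copy of the named modules alongside the LoRA weights), and bare nn.Parameter attributes can be unfrozen by hand after wrapping. A minimal sketch, assuming a LLaMA-style module layout and a hypothetical extra parameter name:

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("huggyllama/llama-7b")  # any causal LM

config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],  # assumed LLaMA-style attention names
    modules_to_save=["embed_tokens"],     # keep a full trainable copy of the embedding
)
model = get_peft_model(model, config)

# Raw nn.Parameter attributes are not covered by modules_to_save,
# so they have to be unfrozen by hand after wrapping:
for name, param in model.named_parameters():
    if "my_extra_param" in name:  # hypothetical parameter name
        param.requires_grad = True

model.print_trainable_parameters()
```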
zhangguangshan opened this issue Sep 15, 2023 · 1 comment

zhangguangshan commented Sep 15, 2023: Running the train_dreambooth_lora script test on the ms2.0 image fails with: AssertionError: Only lora params 348 should be trainable. but got 628 trainable params.

SamitHuang assigned wtomin Sep 18, 2023...
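This kind of error typically comes from a post-setup sanity check: after LoRA injection, every parameter left trainable is expected to be a LoRA parameter. A sketch of such a check, written PyTorch-style for illustration (the failing script runs on MindSpore, and this is not its actual source):

```python
def assert_only_lora_trainable(model):
    # All parameters left trainable after LoRA injection should be LoRA
    # parameters; anything else means some base weight was not frozen.
    trainable = [n for n, p in model.named_parameters() if p.requires_grad]
    lora = [n for n in trainable if "lora" in n]
    assert len(lora) == len(trainable), (
        f"Only lora params {len(lora)} should be trainable. "
        f"but got {len(trainable)} trainable params"
    )
```

628 trainable versus 348 LoRA parameters would suggest roughly 280 non-LoRA tensors (e.g. embedding or norm weights, or a second submodel such as the text encoder) were left with requires_grad=True.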
A proof-of-concept project that showcases the potential for using small, locally trainable LLMs to create next-generation documentation tools. - gamcorn/ue5-llama-lora
A checkpoint-splitting snippet that separates LoRA weights from the other trainable weights in a saved pytorch_model.bin:

```python
import os

import torch


def split_checkpoint(args):  # hypothetical name; the original name is cut off in the snippet
    path = args.checkpoint_path
    trainable_params = args.trainable_params.split(",")
    weights_all = torch.load(os.path.join(path, "pytorch_model.bin"))
    weights_trainable = {}
    weights_lora = {}
    for k in weights_all:
        if "lora" in k:
            # Drop PEFT's adapter-name segment from the key; the string
            # literal in the condition is truncated in the source, so
            # "default." is assumed here.
            k_new = k.replace("default.", "") if "default." in k else k
            weights_lora[k_new] = weights_all[k]
        else:
            if any(n in k for n in trainable_params):
                # k[17:] strips a fixed-length prefix, presumably
                # "base_model.model." (17 characters).
                weights_trainable[k[17:]] = weights_all[k]
    adapter_model = os.path.join(path, "adapter_model.bin")
    trainable_params = os.path.join(path, "trainable_params.bin")
    if not os...  # snippet truncated here in the source
```
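The two files this routine writes can be restored separately. A minimal reload sketch for the non-LoRA tensors, assuming a PEFT-wrapped model and the same key layout (keys in trainable_params.bin had the "base_model.model." prefix stripped, so a non-strict load into that submodule lines them back up):

```python
import os

import torch


def load_extra_trainables(model, path):
    # Restore the non-LoRA trainable tensors saved by the routine above.
    # `model` is assumed to be the PEFT-wrapped model; strict=False skips
    # all the base-model keys that are absent from the file.
    extra = torch.load(os.path.join(path, "trainable_params.bin"))
    missing, unexpected = model.base_model.model.load_state_dict(extra, strict=False)
    return missing, unexpected
```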