Can 16GB-VRAM run Chat-UniVi-7B-v1.5 model? · Issue #37
Can 16GB-VRAM run the Chat-UniVi-7B-v1.5 model? Thanks.

jpthu17 (Member) commented on May 22, 2024:

16GB of VRAM does not seem to be enough for training the model; you can try using LoRA and reducing the batch size. For model inference, 16GB of VRAM is sufficient.
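
As a rough illustration of the suggestion above, here is a minimal sketch of attaching LoRA adapters and loading the model in half precision to cut memory use. It is not the repository's official training script; the model ID and target module names are assumptions and should be replaced with your local checkpoint and the modules used in your setup.

```python
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Load the base model in fp16 to reduce VRAM usage.
# "Chat-UniVi/Chat-UniVi" is a hypothetical hub ID; point this at your checkpoint.
model = AutoModelForCausalLM.from_pretrained(
    "Chat-UniVi/Chat-UniVi",
    torch_dtype=torch.float16,
    device_map="auto",
)

# LoRA configuration: only small low-rank adapters are trained,
# which is the memory saving the maintainer refers to.
lora_config = LoraConfig(
    r=16,                                  # low-rank dimension; smaller r uses less memory
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],   # assumed LLaMA-style attention projections
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()         # only the LoRA adapters are trainable
```

Combining this with a small per-device batch size (for example 1) and gradient accumulation is the usual way to stay within a 16GB budget during fine-tuning.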