File "/home/tony/AI/text_gen/oobabooga_linux_gpu/installer_files/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/tony/AI/text_gen/oobabooga_linux_gpu/installer_files/env/lib/python...
Describe the bug
Trying to clone the repo from the command line.
Is there an existing issue for this?
I have searched the existing issues
Reproduction
git clone https://github.com/oobabooga/text-generation-webui
The clone speed is very slow, ~100 KB/s (on Windows...
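The snippet above reports the slow clone but no fix. One common workaround, a sketch on my part and not something the original report suggests, is a shallow clone, which fetches only the newest commit instead of the full history:

```shell
# Workaround sketch (assumption, not from the original issue): --depth 1
# limits the fetch to the latest commit, so a slow link only has to
# transfer a fraction of the repository's full history.
git clone --depth 1 https://github.com/oobabooga/text-generation-webui
```

The resulting checkout is fully usable for installing and running the webui; `git fetch --unshallow` can retrieve the rest of the history later if it is ever needed.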
//github.com/oobabooga/torch-grammar.git /tmp/pip-req-build-aeyje2et
++ pwd
+ current_dir=/home/dewi/code/text-generation-webui
+ [[ /home/dewi/code/text-generation-webui == \/\m\n\t\/\c* ]]
+ /usr/bin/git clone --filter=blob:none --quiet https://github.com/oobabooga/...
ERROR: exllama-0.0.6+cu117-cp310-cp310-linux_x86_64.whl is not a supported wheel on this platform.
and
ERROR: auto_gptq-0.3.0+cu117-cp310-cp310-linux_x86_64.whl is not a supported wheel on this platform.
The error messages appear quickly after attempting to install the wheels, whic...
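A wheel filename like cp310-cp310-linux_x86_64 encodes the interpreter and platform tags it was built for, and pip rejects a wheel as "not supported" when none of those tags match the running environment. As a diagnostic sketch, pip can print the full list of tags it accepts; the grep pattern below assumes a CPython 3.10 environment and is only illustrative:

```shell
# Diagnostic sketch: list the tags this interpreter accepts. If the
# wheel's tags (cp310-cp310-linux_x86_64 above) are absent from this
# list, pip will refuse the wheel with exactly this error.
python3 -m pip debug --verbose | grep cp310 | head -n 5
```

A mismatch usually means the environment's Python minor version differs from the cp3XX tag in the wheel name, or the wheel was built for a different OS/architecture.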
HappyCatKitten opened this issue Mar 6, 2023 · 3 comments
HappyCatKitten commented Mar 6, 2023 • edited
This works on Linux with a 3090:
python server.py --load-in-8bit --model LLaMa-7B
python server.py --model LLaMa-7B
However, this doesn't work. ...
Describe the bug
Not sure why. Reinstalled CUDA 11.7 (after using --uninstall as well as bin\cuda_uninstaller), and I am getting an error on the latest commit when I try to pip install -r requirements.txt:
ERROR: llama_cpp_python_cuda-0.2.6+cu117-cp310-cp310-manylinux_2_31_x86_64.whl is not...
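Unlike the plain linux_x86_64 wheels above, a manylinux_2_31 tag additionally pins a minimum glibc version (2.31 here), so the same "not a supported wheel" rejection can come from an older C library rather than from a Python-version mismatch. A quick check, as a sketch:

```shell
# Sketch (assumption, not from the original issue): manylinux_2_31 wheels
# require glibc >= 2.31. platform.libc_ver() reports the C library the
# interpreter is linked against, e.g. ('glibc', '2.35').
python3 -c 'import platform; print(platform.libc_ver())'
```

If the reported version is below 2.31, the wheel is rejected regardless of the CUDA or Python versions being correct.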