build [DEBUG] - Command line arguments: --build_dir /home/caracal/onnx/onnxruntime/build/Linux --use_cuda --cudnn_home /usr/local/cuda --cuda_home /usr/local/cuda --config RelWithDebInfo --build_shared_lib --parallel
2022-09-29 17:16:00,446 build [DEBUG] - Defaulting to ...
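A quick way to confirm that the `--use_cuda` build above actually produced a CUDA-enabled package is to query the available execution providers from Python. This is a suggested follow-up check, not part of the original log; it only assumes the standard onnxruntime Python API.

```python
import onnxruntime as ort

# Sanity check (suggested follow-up, not from the log): a --use_cuda build
# should register the CUDA execution provider in the resulting package.
print(ort.__version__)
print(ort.get_available_providers())  # expect 'CUDAExecutionProvider' in the list
```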
speculative_config=None, tokenizer='/models/mistralai/Mixtral-8x7B-Instruct-v0.1', skip_tokenizer_init=False, tokenizer_mode=auto, revision=None, rope_scaling=None, rope_theta=None, tokenizer_revision=None, trust
Requirement already satisfied: protobuf>=3.20.2 in /home/user/mambaforge/envs/tensorml/lib/python3.11/site-packages (from onnx>=1.12.0->-r requirements.txt (line 8)) (4.25.1)
Requirement already satisfied: tensorrt-libs==9.1.0.post12.dev4 in /home/user/mambaforge/envs/tensorml/lib/python3....
max_seq_len_to_capture=8192, disable_custom_all_reduce=False, tokenizer_pool_size=0, tokenizer_pool_type='ray', tokenizer_pool_extra_config=None, limit_mm_per_prompt=None, mm_processor_kwargs=None, enable_lora=False, enable_lora_bias=False, max_loras=1, max_lora_rank=16, lora_extra_...
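For reference, a minimal sketch of how engine arguments like the ones dumped above are typically supplied when constructing the engine from Python. The model path is taken from the log; the prompt and sampling values are assumptions.

```python
from vllm import LLM, SamplingParams

# Minimal sketch (values other than the model path are assumptions): the
# keyword arguments mirror fields from the engine-argument dump above.
llm = LLM(
    model="/models/mistralai/Mixtral-8x7B-Instruct-v0.1",
    tokenizer_mode="auto",
    max_seq_len_to_capture=8192,
    disable_custom_all_reduce=False,
    enable_lora=False,
)
outputs = llm.generate(["Hello, how are you?"], SamplingParams(max_tokens=32))
print(outputs[0].outputs[0].text)
```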
I am adapting a large language model for an NPU. Since the OPT model is only a basic model and cannot handle dialogue, I have to adapt other models. I have the following code: @echo off python run_onnx.py --model_name %1 --onnx op...
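On the ONNX side, a script like the run_onnx.py invoked above usually starts by opening an inference session and inspecting the exported graph's inputs. The sketch below is hypothetical; the file name and provider list are placeholders, not taken from the post.

```python
import onnxruntime as ort

# Hypothetical sketch: open the exported decoder model and list its inputs.
# "model.onnx" and the provider list are placeholders, not from the post.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
for inp in session.get_inputs():
    print(inp.name, inp.shape, inp.type)
```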
toniedeng: Here's my Python env for grounded-sam; you can check the package versions:

| Package | Version |
| --- | --- |
| absl-py | 1.4.0 |
| addict | 2.4.0 |
| aiofiles | 23.1.0 |
| aiohttp | 3.8.4 |
| aiosignal | 1.3.1 |
| altair | 4.2.2 |
| antlr4-python3-runtime | 4.9.3 |
| anyio | 3.6.2 |
| asttokens | 2.0.8 |

async-time...
I am trying to compile deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B to MLIR with the following script: # Import necessary libraries import torch from transformers import AutoModelForCausalLM, AutoTokenizer from torch.export import export import onnx from torch_mlir import fx # Load the DeepSeek model ...
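A sketch of how the truncated script above might continue, assuming torch_mlir's fx.export_and_import entry point. The wrapper class and output file name are illustrative, and exporting a full Hugging Face causal LM this way can require disabling the KV cache so torch.export sees a plain-tensor graph.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from torch_mlir import fx

# Illustrative sketch -- the model id comes from the post, everything else is assumed.
model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float32)
model.eval()

# Wrap the HF model so the exported graph returns a plain tensor (logits)
# instead of a ModelOutput dataclass, which torch.export handles poorly.
class LogitsOnly(torch.nn.Module):
    def __init__(self, m):
        super().__init__()
        self.m = m

    def forward(self, input_ids):
        return self.m(input_ids, use_cache=False).logits

example_ids = tokenizer("Hello", return_tensors="pt")["input_ids"]
mlir_module = fx.export_and_import(LogitsOnly(model), example_ids)

with open("deepseek_r1_distill_qwen_1_5b.mlir", "w") as f:
    f.write(str(mlir_module))
```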
[torch]>=4.37.0
## fix transformer engine tp-comm-overlap
RUN pip uninstall -y onnx && pip install --no-cache-dir onnx==1.16.1
RUN apt-get update && apt-get install -y openssh-server && \
    echo "Port 54321" >> /etc/ssh/sshd_config && \
    echo "PermitRootLogin yes" >> /etc/ssh/sshd_...
Solution to issue cannot be found in the documentation. I checked the documentation. Issue: Can't update mamba. This is the error I'm getting when running mamba upgrade --all: Download error (23) Failed writing received data to disk/application [https:...
tokenizer.save_pretrained(SAVE_DIR)```

### Before submitting a new issue...
- [X] Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the [documentation page](https://docs.vllm.ai/en/latest/), which can answer lots of freque...