Hi, following your code I tried to compile and deploy the whisper-large-v3-turbo model, and it fails with the error below. As far as I can tell, the 24.09-trtllm-python-py3 image ships TensorRT-LLM 0.13.0. Did this work in your tests?

```
Traceback (most recent call last):
  File "/workspace/TensorRT-LLM/examples/whisper/convert_checkpoint.py", line 24, in <module> ...
```
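For reference, a minimal sketch of how one can confirm which TensorRT-LLM build the container actually ships before running the whisper conversion script. The image tag is taken from the comment above; everything else here is a generic check, not the exact commands from the original report:

```bash
# Start an interactive shell in the 24.09 Triton + TRT-LLM image
# (image tag from the report above).
docker run --rm -it --gpus all nvcr.io/nvidia/tritonserver:24.09-trtllm-python-py3 bash

# Inside the container: print the bundled TensorRT-LLM version
# before invoking examples/whisper/convert_checkpoint.py.
python3 -c "import tensorrt_llm; print(tensorrt_llm.__version__)"
```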
Using the image nvcr.io/nvidia/tritonserver:24.05-trtllm-python-py3 and trying to start the backend, you will get this error:

```
+ '[' 1 -eq 0 ']'
+ command=serve
+ export DATADIR=/data
+ DATADIR=/data
+ export TRTDIR=/data/git_TensorRT-LLM
+ TRTDIR=/data/git_TensorRT-LLM
+ export MIXTR...
```
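For context, a sketch of how that failing `serve` startup might be reproduced. Only the DATADIR/TRTDIR values and the `serve` argument come from the trace above; the volume mount and the launch-script path are assumptions, not the actual setup:

```bash
# Hypothetical reproduction of the failing startup. DATADIR, TRTDIR and the
# "serve" command are taken from the trace above; the mount point and the
# start.sh path are placeholders for whatever script the deployment uses.
docker run --rm -it --gpus all \
  -v /data:/data \
  -e DATADIR=/data \
  -e TRTDIR=/data/git_TensorRT-LLM \
  nvcr.io/nvidia/tritonserver:24.05-trtllm-python-py3 \
  bash -x /data/start.sh serve   # hypothetical launch script
```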