1. Set up the senior student's environment, and wrote scripts for point processing (rotation, mirroring) and for saving (from an in-memory numpy array to a shp file).
2. Did a first read of the MeshGPT paper and thought about the Transformer network architecture (how training differs between a translation model and a completion model).

Environment setup:
# pip install torch==2.1.2 torchvision==0.16.2 torchaudio==2.1.2 --index-url https://download.pytorch.org/whl/cu118
pip inst...
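The point-processing step mentioned above (rotation and mirroring of points held in a numpy array) can be sketched with plain numpy. This is a minimal illustration, not the actual script; writing the result out to a .shp file would additionally need a library such as pyshp or geopandas (an assumption, not shown here):

```python
import numpy as np

def rotate_points(points, angle_deg):
    """Rotate an (N, 2) array of points counter-clockwise about the origin."""
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return points @ rot.T

def mirror_points(points, axis=0):
    """Mirror (N, 2) points across the x-axis (axis=0) or y-axis (axis=1)."""
    out = points.copy()
    out[:, 1 - axis] *= -1  # mirroring across the x-axis flips y, and vice versa
    return out

pts = np.array([[1.0, 0.0], [0.0, 2.0]])
print(rotate_points(pts, 90))       # ≈ [[0, 1], [-2, 0]]
print(mirror_points(pts, axis=0))   # [[1, 0], [0, -2]]
```

For saving, the transformed array would then be handed to a shapefile writer (e.g. pyshp's `shapefile.Writer`, one point record per row) in the actual script.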
Excerpt from the meshgpt-pytorch setup.py:

url='https://github.com/lucidrains/meshgpt-pytorch',
keywords=['artificial intelligence', 'deep learning', 'attention mechanisms', 'transformers', 'mesh generation'],
install_requires=['einops>=0.7.0', 'torch>=2.0', 'vector-quantize-pytorch', ...
In that commit, meshgpt_pytorch/version.py bumps __version__ from '1.5.9' to '1.5.10', with a matching one-line change in setup.py.
pip install torch-scatter -f https://data.pyg.org/whl/torch-2.1.0+cu118.html
pip install torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 --index-url https://download.pytorch.org/whl/cu118
pip install packaging
pip install -r requirements.txt
...
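After the installs above, a quick sanity check (a hypothetical session; the printed versions depend on your environment) can confirm that the CUDA build of torch and torch-scatter imported correctly:

```shell
# verify the installed PyTorch build and CUDA availability
python -c "import torch; print(torch.__version__)"
python -c "import torch; print(torch.cuda.is_available())"
# confirm torch-scatter was built against a compatible torch
python -c "import torch_scatter; print('torch-scatter OK')"
```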
Hi, thank you for your great implementation of the MeshGPT paper. I have a question related to section 3.1: "For sequence ordering, PolyGen [43] suggests a convention where faces are ordered based on their lowest vertex index, follo...
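The ordering convention the question refers to (PolyGen-style, as cited in MeshGPT section 3.1) can be sketched as follows: sort vertices by z, then y, then x; cyclically rotate each face so its lowest vertex index comes first; then sort the faces by their lowest index. This is my own sketch of the convention, with unspecified tie-breaking, not the meshgpt-pytorch implementation:

```python
import numpy as np

def order_mesh(vertices, faces):
    """Canonically order a mesh: vertices by (z, y, x), faces by lowest index."""
    vertices = np.asarray(vertices, dtype=float)
    # np.lexsort treats its LAST key as primary, so (x, y, z) sorts z-major
    order = np.lexsort((vertices[:, 0], vertices[:, 1], vertices[:, 2]))
    new_index = np.empty(len(vertices), dtype=int)
    new_index[order] = np.arange(len(vertices))
    sorted_vertices = vertices[order]
    remapped = [[int(new_index[v]) for v in f] for f in faces]
    # rotate each face so its smallest vertex index leads (preserves winding)
    cycled = []
    for f in remapped:
        k = f.index(min(f))
        cycled.append(f[k:] + f[:k])
    cycled.sort()  # order faces by lowest vertex index (then lexicographically)
    return sorted_vertices, cycled

sv, fc = order_mesh([[0, 0, 1], [0, 0, 0], [1, 0, 0]], [[0, 1, 2]])
print(sv)  # vertices re-sorted by z, then y, then x
print(fc)  # [[0, 1, 2]]
```

Flattening the reordered faces row by row then yields the token sequence the autoregressive model is trained on.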
Reference: lucidrains/meshgpt-pytorch ("Implementation of MeshGPT, SOTA mesh generation using attention, in PyTorch"), Pull Request #85, "try to figure out issue with gh actions".