```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = '01-ai/Yi-34b-Chat'
tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)

# Since transformers 4.35.0, the GPT-Q/AWQ model can be loaded using AutoModelForCausalLM.
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    device_map="auto",
    torch_dtype="auto",
).eval()
```
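Yi chat models use a ChatML-style prompt format; with a recent transformers version, `tokenizer.apply_chat_template` applies the model's own template for you. As a rough sketch of what such a template produces (the `<|im_start|>`/`<|im_end|>` markers here are an assumption for illustration — prefer the tokenizer's built-in template in real use):

```python
# Minimal sketch of ChatML-style prompt construction for a chat model.
# The <|im_start|>/<|im_end|> markers are assumed here; in practice, call
# tokenizer.apply_chat_template so the model's own template is used.

def build_chatml_prompt(messages):
    """Render a list of {'role', 'content'} dicts into a ChatML-style prompt."""
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    # Leave the assistant turn open so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([{"role": "user", "content": "hi"}])
print(prompt)
```

The resulting string is what you would tokenize and pass to `model.generate`.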
```bash
git clone https://github.com/01-ai/Yi.git
cd Yi
pip install -r requirements.txt
```

Step 2: Download the Yi model

You can download the weights and tokenizer of Yi models from the following sources:

- Hugging Face
- ModelScope
- WiseModel

Step 3: Perform inference

You can perform inference with Yi...
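After downloading from any of the sources above, it can help to sanity-check that the local checkpoint directory is complete before loading it. A minimal sketch, assuming the usual Hugging Face checkpoint layout (the file list is an illustrative assumption, not an official manifest from the Yi repo):

```python
from pathlib import Path

# Files typically present in a Hugging Face-style checkpoint directory.
# This minimal list is an assumption for illustration, not an official manifest.
REQUIRED_FILES = ["config.json", "tokenizer_config.json"]

def missing_model_files(model_dir):
    """Return the required files that are absent from model_dir."""
    root = Path(model_dir)
    return [name for name in REQUIRED_FILES if not (root / name).is_file()]

# Example: an empty directory is missing everything.
import tempfile
with tempfile.TemporaryDirectory() as d:
    print(missing_model_files(d))
```

If the returned list is non-empty, re-run the download before attempting inference.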
This project is a secondary development based on nineai 2.4.2. It is for learning and reference only and may not be used commercially; no responsibility is accepted for consequences arising from misuse. The integrated deployment package is in the YiAiQuickDeploy directory.

Yi-Ai Changelog

V2.5.1 (2024-02-05)
Improvements
- Optimized title display: long titles are now truncated...
- Left sidebar improvements: new-conversation button, simplified features
- Improved search
- Narrowed the left menu bar

V2.5.0 (2024-02-03)
Feature updates
...
A series of large language models trained from scratch by developers @01-ai.
Recommended API relay provider: ChatfireAPI, which supports popular Chinese and international large models.

V2.6.0 (2024-07-05)
Improvements
- Updated the model list, adding gpt-4o, claude3.5, and other popular models: ...
```bash
docker run -it --gpus all \
    -v <your-model-path>:/models \
    ghcr.io/01-ai/yi:latest
```

Alternatively, you can pull the Yi Docker image from registry.lingyiwanwu.com/ci/01-ai/yi:latest.

Step 2: Perform inference

You can perform inference with Yi chat or base models as below. Perform...