Repository layout: YiAiQuickDeploy (bundled deployment package), admin, chat, service.
This project is a secondary development based on nineai 2.4.2. It is provided for learning and reference only and must not be used commercially; we accept no responsibility for any consequences arising from such use. The bundled package is located in the YiAiQuickDeploy directory. Recommended API relay provider: ChatfireAPI, which supports popular Chinese and international large models.

Yi-Ai Changelog

V2.6.0 (2024-07-05) - Feature improvements
- Updated the model list, adding popular models such as gpt-4o and claude3.5: ...
1. In the service (backend) directory, edit the database settings in .env, then run `pnpm install` in a terminal to install dependencies, run `pnpm dev` to debug and import the database, and finally run `pnpm build` to compile and package. This produces the same files as the bundled version above.
2. For chat (user frontend) and admin (admin backend), you only need to change the backend API endpoint in each directory's .env, then in each of them run pnpm in...
```bash
git clone https://github.com/01-ai/Yi.git
cd Yi
pip install -r requirements.txt
```

Step 2: Download the Yi model

You can download the weights and tokenizer of Yi models from the following sources: Hugging Face, ModelScope, WiseModel.

Step 3: Perform inference

You can perform inference with Yi...
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = '01-ai/Yi-34b-Chat'

tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)

# Since transformers 4.35.0, the GPT-Q/AWQ model can be loaded using AutoModelForCausalLM.
# The tail of this call was truncated in the source; device_map/torch_dtype below follow the usual Yi quick-start settings.
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    device_map="auto",
    torch_dtype="auto",
).eval()
```
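Continuing from the loading snippet above, here is a minimal generation sketch in the standard chat-template style; the example prompt, the `max_new_tokens` cap, and moving inputs to `model.device` are illustrative assumptions rather than part of the original snippet:

```python
# Build a chat-format prompt and generate a reply with the model loaded above.
messages = [
    {"role": "user", "content": "hi"}  # example prompt (assumption)
]

input_ids = tokenizer.apply_chat_template(
    conversation=messages,
    tokenize=True,
    add_generation_prompt=True,
    return_tensors="pt",
)

# Move the prompt to the model's device and generate; 256 new tokens is an arbitrary cap.
output_ids = model.generate(input_ids.to(model.device), max_new_tokens=256)

# Decode only the newly generated tokens, skipping special tokens.
response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)
print(response)
```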
Reminder: I have searched the GitHub Discussions and issues and have not found anything similar to this.

Environment
- OS: Ubuntu 20.04
- Python: 3.10.12
- PyTorch: 2.1.0
- CUDA: 12.4
- torch.version.cuda: '12.1'

Current Behavior

RuntimeError: ...
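For reference, a small script like the following (an illustrative sketch, not taken from the issue) collects the version details reported in the Environment section above, including the CUDA version PyTorch was built against:

```python
# Print the environment details typically requested in bug reports.
import platform
import torch

print("Python:", platform.python_version())
print("PyTorch:", torch.__version__)
print("CUDA (torch build):", torch.version.cuda)   # e.g. '12.1'; the system toolkit may differ
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```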