You can also start KAI directly in horde mode by using the command line in the play.(sh|bat) file. Pass the arguments to start a KAI instance in cluster mode like so (change "0000000000" to your own API key, if you have one):
args.hordekey = horde_apikey_var.get()
args.hordeworkername = horde_workername_var.get()
if sd_model_var.get() != "":
    args.sdmodel = sd_model_var.get()
if sd_clamped_var.get() == 1:
    args.sdclamped = True
args.sdthreads = (0 if sd_threads_var.get() == "" else int(sd_thr...
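The snippet above copies GUI fields into the launcher's parsed arguments. As a minimal sketch of the flag parsing this implies, assuming the flag names match the `args.*` attribute names in the snippet (check the launcher's `--help` output for the authoritative list):

```python
import argparse

# Sketch only: flag names are assumed from the args.* attributes above,
# not taken from the real launcher's argument definitions.
parser = argparse.ArgumentParser(description="horde worker launch flags (sketch)")
parser.add_argument("--hordekey", default="", help="AI Horde API key")
parser.add_argument("--hordeworkername", default="", help="worker name shown on the horde")
parser.add_argument("--sdmodel", default="", help="optional Stable Diffusion model to serve")
parser.add_argument("--sdclamped", action="store_true", help="clamp image generation sizes")
parser.add_argument("--sdthreads", type=int, default=0, help="threads for image generation")

# Example invocation, equivalent to passing these flags on the command line:
args = parser.parse_args(["--hordekey", "0000000000", "--hordeworkername", "MyWorker"])
print(args.hordekey, args.hordeworkername)
```

Fields left empty in the GUI fall back to the defaults, which mirrors the `if ... != ""` guards in the original snippet.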
No matter whether you want to use the free, fast power of Google Colab, your own high-end graphics card, or an online service you have an API key for (like OpenAI or InferKit), or would rather just run it more slowly on your CPU, you will be able to find a way to use KoboldAI that works for you.
"api/extra" 18 changes: 9 additions & 9 deletions 18 klite.embd Original file line numberDiff line numberDiff line change @@ -1,12 +1,12 @@ <!DOCTYPE html> <!-- Kobold Lite WebUI is a standalone WebUI for use with KoboldAI United, AI Horde, or koboldcpp. KoboldAI Lite ...