https://github.com/SnowMasaya/text-generation-webui/tree/deepl/extensions/deepl_translate

oobabot: another Discord bot, with both command-line and GUI modes. Easy setup, lots of config options, and customizable characters! oobabot runs in command-line mode and uses Oobabooga's API module; oobabot-plugin ...
I'm getting a similar problem using the built-in API and the API tester modal that appears after clicking "Use via API" in the webui footer. The API is enabled in the settings. This is the error message I'm getting from server.py:
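For comparison, a request that bypasses the modal can be built by hand. This is a minimal sketch assuming the legacy blocking API's default endpoint (`http://localhost:5000/api/v1/generate`, enabled with the `--api` flag); the host, port, and payload keys are assumptions about the default configuration, not taken from the report above:

```python
# Hedged sketch: build a request for the legacy blocking API of
# text-generation-webui. The URL and JSON fields assume the defaults
# (--api flag, port 5000); adjust for your setup.
import json
from urllib import request

API_URL = "http://localhost:5000/api/v1/generate"  # assumed default

def build_generate_request(prompt, max_new_tokens=200):
    """Return a urllib Request for the /api/v1/generate endpoint."""
    payload = json.dumps({
        "prompt": prompt,
        "max_new_tokens": max_new_tokens,
    }).encode("utf-8")
    return request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    req = build_generate_request("Hello")
    # Uncomment with a running server:
    # with request.urlopen(req) as resp:
    #     print(json.load(resp))
```

If this raises the same TypeError on the server side, the problem is in the extension code rather than in how the request is formed.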
File "/mount/chonky-files/oobabooga/installer_files/env/lib/python3.10/http/server.py", line 421, in handle_one_request
    method()
File "/mount/chonky-files/oobabooga/text-generation-webui/extensions/api/blocking_api.py", line 82, in do_POST
    generator = generate_chat_reply(
TypeError:...
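A TypeError at a call site like this usually means the callee's signature changed between versions (the API module passing arguments that `generate_chat_reply` no longer accepts, or vice versa). As a minimal sketch of a defensive pattern, not the project's actual code, the caller can filter its kwargs against the callee's signature before forwarding them; `generate_chat_reply_stub` below is a hypothetical stand-in, since the real signature isn't shown here:

```python
# Sketch: guard against signature drift by forwarding only the keyword
# arguments the callee actually declares. This is an illustration, not
# how text-generation-webui resolves the error above.
import inspect

def call_with_supported_kwargs(func, **kwargs):
    """Call func with only the kwargs present in its signature."""
    params = inspect.signature(func).parameters
    accepted = {k: v for k, v in kwargs.items() if k in params}
    return func(**accepted)

def generate_chat_reply_stub(text, state=None):
    # Hypothetical stand-in for the real generate_chat_reply.
    return f"reply to {text}"

# 'regenerate' is silently dropped because the stub doesn't declare it.
print(call_with_supported_kwargs(generate_chat_reply_stub, text="hi", regenerate=False))
# prints "reply to hi"
```

The real fix is to keep the API extension and the core modules at matching versions; filtering kwargs only hides the mismatch.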
(func, *args)
File "F:\WBC\text-generation-webui\text-generation-webui\server.py", line 76, in load_lora_wrapper
    add_lora_to_model(selected_lora)
File "F:\WBC\text-generation-webui\text-generation-webui\modules\LoRA.py", line 34, in add_lora_to_model
    shared.model = PeftModel....
(*self._args, **self._kwargs)
File "/home/monster/Desktop/oobabooga_linux/text-generation-webui/extensions/api/blocking_api.py", line 72, in _run_server
    server = ThreadingHTTPServer((address, port), Handler)
File "/home/monster/Desktop/oobabooga_linux/installer_files/env/lib/python3.10/socketserver....
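A failure inside the `ThreadingHTTPServer` constructor at this point is typically the bind step: if the API port is already in use (say, by a previous instance that didn't shut down), the constructor raises `OSError: Address already in use`. A small sketch reproducing that, using an ephemeral port rather than the webui's actual port:

```python
# Sketch: ThreadingHTTPServer binds its socket in the constructor, so a
# second server on the same port fails immediately with OSError.
from http.server import ThreadingHTTPServer, BaseHTTPRequestHandler

def try_bind(address="127.0.0.1", port=0):
    """Return a bound server, or None if the port is already taken."""
    try:
        return ThreadingHTTPServer((address, port), BaseHTTPRequestHandler)
    except OSError:
        return None

first = try_bind()                    # port 0: the OS picks a free port
taken_port = first.server_address[1]
second = try_bind(port=taken_port)    # same port, now in use
print(second is None)                 # True: the second bind fails
first.server_close()
```

Checking for a stale process holding the port (or letting the old instance exit fully) is usually the fix when this traceback ends in `OSError`.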
Describe the bug
After updating, the API stopped working: loading extensions/openai threw an error saying that sse_starlette was not available. Since I'm too lazy to fork and open a PR:

diff --git a/requirements.txt b/requirements.txt
index 62bf22b..1410a...
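The error is just a missing-dependency check failing at import time. A minimal sketch of how such a check can be done without crashing the extension load (the function names here are illustrative, not the extension's actual code):

```python
# Sketch: probe for an optional dependency before importing it, so a
# missing package can be reported cleanly instead of raising at load time.
import importlib.util

def module_available(name: str) -> bool:
    """True if the named top-level module can be imported."""
    return importlib.util.find_spec(name) is not None

if not module_available("sse_starlette"):
    print("sse_starlette is missing; install it or add it to requirements.txt")
```

The diff above presumably just adds `sse_starlette` to requirements.txt, which is the proper fix.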
webui/installer_files/env/lib/python3.11/site-packages/gradio/blocks.py", line 1786, in process_api
    result = await self.call_function(
    ^^^
File "/media/nabab/NVME2/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/gradio/blocks.py", line 1350, in call_function
    pr...
/text-generation-webui/installer_files/env/lib/python3.11/site-packages/accelerate/utils/imports.py:245: UserWarning: Intel Extension for PyTorch 2.0 needs to work with PyTorch 2.0.*, but PyTorch 2.1.0 is found. Please switch to the matching version and run again.
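The warning boils down to a major.minor version comparison: IPEX 2.0.* requires PyTorch 2.0.*, and 2.1.0 fails that check. A minimal sketch of the comparison (not accelerate's actual implementation):

```python
# Sketch: the compatibility check behind the UserWarning, reduced to a
# major.minor comparison between the required and installed versions.
def same_major_minor(required: str, installed: str) -> bool:
    """True when installed matches required on the major.minor components."""
    need = tuple(int(p) for p in required.split(".")[:2])
    have = tuple(int(p) for p in installed.split(".")[:2])
    return need == have

print(same_major_minor("2.0", "2.0.1"))  # True: compatible
print(same_major_minor("2.0", "2.1.0"))  # False: triggers the warning
```

Downgrading PyTorch to 2.0.* or upgrading IPEX to a 2.1-compatible release resolves it.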
The problem started when you changed the int8 multiplication to something unsupported. The fix for that is too slow, so I have been using 0.4.2. I am unable to use the --rwkv-cuda-on option; I get this error:

File "E:\oobabooga\text-generation-webui\server.py", line 241, in...