I got it to work on SDNext and it's pretty fast. But unfortunately, SDNext is not well suited to running SDXL-type models/checkpoints on an Intel Arc GPU. It cannot load them because it needs more than 16GB. Somewhere on the net, I read that ComfyUI ...
If you have another UI installed and working with its own Python venv, you can use that venv to run ComfyUI. Open your favorite terminal and activate it: source path_to_other_sd_gui/venv/bin/activate, or on Windows with PowerShell: "path_to_other_sd_gui\venv\Scripts\...
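A minimal sketch of that workflow on Linux/macOS, assuming ComfyUI is already cloned at ~/ComfyUI and that the other UI's venv path is a placeholder you replace with your own install:

```bash
# Reuse the other UI's existing venv instead of creating a new one.
source path_to_other_sd_gui/venv/bin/activate

# Launch ComfyUI with the packages from that venv.
cd ~/ComfyUI        # assumed location of the ComfyUI checkout
python main.py
```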
If you do not run ComfyUI locally, a non-GPU instance such as t3.small also works. If you want to run the FLUX.1 model, use at least g5.2xlarge for the fp8 version and at least g5.4xlarge for the fp16 version. AWS Cloud9 or a local machine also works, but make sure the following are installed ...
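As a rough illustration of that sizing advice, here is a hypothetical AWS CLI call that launches a g5.2xlarge for the fp8 FLUX.1 model; the AMI, key pair, and security group IDs below are placeholders, not values from the original guide:

```bash
# Hypothetical launch of a GPU instance sized for FLUX.1 fp8 (g5.2xlarge).
# Substitute your own AMI ID, key pair, and security group before running.
aws ec2 run-instances \
  --image-id ami-xxxxxxxxxxxxxxxxx \
  --instance-type g5.2xlarge \
  --key-name my-key-pair \
  --security-group-ids sg-xxxxxxxxxxxxxxxxx \
  --count 1
```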
Select an SDXL Turbo model in the Stable Diffusion checkpoint dropdown menu. We will use the Dreamshaper SDXL Turbo model. Download the model and put it in the folder stable-diffusion-webui > models > Stable-diffusion. Step 2: Enter the txt2img settings. Go to the txt2img page. You can control...
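A sketch of that download step, assuming the webui lives in stable-diffusion-webui/ under the current directory; the download URL and filename are placeholders, so copy the actual link from the Dreamshaper SDXL Turbo model page:

```bash
# Place the Dreamshaper SDXL Turbo checkpoint where the webui looks for models.
# <MODEL_DOWNLOAD_URL> is a placeholder for the link on the model's download page.
cd stable-diffusion-webui/models/Stable-diffusion
wget -O dreamshaperXL_turbo.safetensors "<MODEL_DOWNLOAD_URL>"
```

After the file is in place, restart the webui or click the refresh button next to the checkpoint dropdown so the new model appears in the list.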