When set to `True`, Rank-Stabilized LoRA (rsLoRA) sets the adapter scaling factor to `lora_alpha/math.sqrt(r)`, which has been shown to work better. Otherwise, it uses the original default value of `lora_alpha/r`.

In equation form:

- Normal LoRA layers: `W0X + (lora_alpha/r)(BAX)`
- rsLoRA layers: `W0X + (lora_alpha/sqrt(r))(BAX)`

where `W0` is the base layer's frozen weight matrix, `B` and `A` are the low-rank adapter matrices, and `X` is the input.
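As a brief illustration, here is a minimal sketch of the two scaling rules and a toy forward pass. The helper name `lora_scaling` and the tensor shapes are illustrative assumptions, not part of any library API:

```python
import math
import torch

def lora_scaling(lora_alpha: float, r: int, use_rslora: bool = False) -> float:
    # rsLoRA scales the adapter output by alpha/sqrt(r);
    # classic LoRA uses alpha/r.
    return lora_alpha / math.sqrt(r) if use_rslora else lora_alpha / r

# Toy forward pass showing both formulas (dimensions are arbitrary).
d_in, d_out, r, lora_alpha = 16, 32, 8, 16
W0 = torch.randn(d_out, d_in)  # frozen base weights
A = torch.randn(r, d_in)       # low-rank down-projection
B = torch.zeros(d_out, r)      # low-rank up-projection (zero-initialized in LoRA)
x = torch.randn(d_in)

h_lora   = W0 @ x + lora_scaling(lora_alpha, r, use_rslora=False) * (B @ (A @ x))
h_rslora = W0 @ x + lora_scaling(lora_alpha, r, use_rslora=True)  * (B @ (A @ x))
```

In the PEFT library this behavior is toggled by the `use_rslora` flag on `LoraConfig`; only the scaling constant changes, so switching it on requires no change to the adapter weights themselves.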