The first character made me learn more about LoRA training. The first time, I used AdamW8bit and thought it would be just another easy, boring training run, but I found the result didn't even match the training images. So I tried a few more times with different settings and images, but the results were the same. Two days ago, I found an important video on bilibili:
[The most detailed walkthrough of the 秋叶 trainer's parameters on the web — filling in the gaps: LoRA training, scheduler functions, optimizers, AdamW8bit, Lion, DA, D-Adaptation, and the Prodigy optimizer] https://www.bilibili.com/video/BV1a3iYY3ETn/?share_source=copy_web&vd_source=62b7af7b4367facd37340c57ca6b1e27
It explains the meaning of every setting, and the biggest change to my LoRA training was switching my optimizer to Prodigy. It improved my LoRA quality hugely. Since then, all my LoRAs have been trained with Prodigy, and will be until I find a better optimizer.
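For reference, this is roughly how Prodigy can be enabled in a kohya-style sd-scripts config (the backend the 秋叶 trainer builds on). Treat it as a sketch from common community configs, not a verified setup — the key names and values are assumptions, and the optimizer arguments mirror the prodigyopt package's options. The main point: Prodigy estimates its own step size internally, so the learning rate is left at 1.0.

```toml
# Hedged sketch: enabling Prodigy in a kohya-style sd-scripts config.
# Key names follow common community examples; check your trainer's docs.
optimizer_type = "Prodigy"
learning_rate = 1.0          # Prodigy adapts the effective LR itself, so keep this at 1.0
unet_lr = 1.0
text_encoder_lr = 1.0
lr_scheduler = "cosine"      # an annealing scheduler is commonly paired with Prodigy
optimizer_args = [
  "decouple=True",           # decoupled (AdamW-style) weight decay
  "weight_decay=0.01",
  "d_coef=1.0",              # scales Prodigy's estimated step size
  "use_bias_correction=True",
  "safeguard_warmup=True",   # recommended when a warmup schedule is used
]
```

Compared with AdamW8bit, where picking the learning rate is up to you, this setup hands that choice to the optimizer, which is likely why it is more forgiving on small LoRA datasets.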
AdamW8bit:

