Just testing some ideas with Flux :)
For V4, pretty much the same as V3: 6-10 steps.
For V3 use Euler with the Beta or Simple scheduler (DPM++ 2M with SGM Uniform seems fine too), 8-10 steps.
This is a UNet only; you also need to download and load the text encoders from https://huggingface.co/lllyasviel/flux_text_encoders/tree/main
If you use ComfyUI, this node works for NF4: https://github.com/DenkingOfficial/ComfyUI_UNet_bitsandbytes_NF4
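For reference, a minimal sketch of where the files go in a default ComfyUI install (the `COMFY` path and the comment filenames are placeholders/assumptions, not exact names from this page):

```shell
# Sketch: expected folder layout for a UNet-only model in ComfyUI.
# COMFY is assumed to point at your ComfyUI checkout.
COMFY=./ComfyUI

mkdir -p "$COMFY/models/unet" "$COMFY/models/clip" "$COMFY/models/vae"

# - the NF4 UNet-only file from this page   -> models/unet
#   (loaded with the bitsandbytes NF4 UNet loader node linked above)
# - the text encoders (clip_l + t5xxl) from
#   https://huggingface.co/lllyasviel/flux_text_encoders -> models/clip
# - the Flux VAE -> models/vae

ls "$COMFY/models"
```

With Forge instead of ComfyUI there is no node graph to set up; the folder split is only needed because this upload is the UNet alone rather than an all-in-one checkpoint.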
Description
1st version, a bit of an experiment. This one uses NF4.
Comments (18)
Could you share your workflow? This isn't loading with either the "Load Diffusion Model" or "CheckpointLoaderNF4" for me.
I am using Forge https://github.com/lllyasviel/stable-diffusion-webui-forge , so there is not really a workflow.
I might make an FP8 version that should give you less trouble, I don't know how well NF4 is supported by Comfy.
Looking into this, too. Thought I had all possible loaders covered by now... 😅
@TiwazM Glad to see you experiment with Flux models, too!
@TiwazM Nope, same error with CheckpointLoaderNF4: "ERROR: Could not detect model type of:"
@MrHYD I will have a look later
@redpinkretro Of course, how could I not ;)
@MrHYD Ooh, I think I know what the issue might be: it is the UNet only, so you need to put it in the unet folder?
@TiwazM Just tried to load it with all the loaders: regular checkpoint, the specific BNB NF4 loader, and the diffusers loader for UNet-only models. None worked :/
@MrHYD I will make an FP8 of the new version tomorrow; sorry, I got distracted making this. Or I can see if I can include everything.
@TiwazM Thanks :-)
I'm excited to try it out
@MrHYD Still having a look, but if you want to try NF4, this works for me: https://github.com/DenkingOfficial/ComfyUI_UNet_bitsandbytes_NF4
@TiwazM Yep, switched to that fork and it loads. Thanks :-)
@TiwazM I just uploaded the FP8, but I'm not sure I don't still prefer the NF4 in some ways.
@MrHYD Yeah, that one I had in use already; it has a complete checkpoint loader for NF4 AIO and one for NF4 UNet only, but I had stuff built into a group node without realizing I was trying the wrong one in different configs. My bad 😅
What tool can I use to merge a LoRA into Flux?
https://github.com/bmaltais/kohya_ss, but you need to use the flux branch that is under development.