CivArchive
    Unsloth - FLUX.2-Klein-4B-GGUF (Distilled) - Q5_0
    NSFW

    This is a GGUF-quantized version of FLUX.2-klein-4B.
    unsloth/FLUX.2-klein-4B-GGUF uses Unsloth's Dynamic 2.0 methodology for SOTA performance.

    • Important layers are upcasted to higher precision.

    • Uses tooling from ComfyUI-GGUF by city96.
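For intuition, the "important layers upcasted" idea can be sketched as a per-layer precision policy: weights whose names mark them as sensitive stay at higher precision, while ordinary weight matrices take the default quant. This is an illustrative sketch only, not Unsloth's actual selection logic; the keyword list and layer names below are hypothetical choices:

```python
# Illustrative per-layer quantization policy. Sensitive layers (input
# projections, norms, the final layer) are kept at F16; everything else
# gets the default quant (Q5_0 for this file). Keywords are hypothetical.

SENSITIVE_KEYWORDS = ("embed", "norm", "final_layer", "txt_in", "img_in")

def pick_quant(layer_name: str, default: str = "Q5_0") -> str:
    """Return the quant type to use for a given layer name."""
    if any(k in layer_name for k in SENSITIVE_KEYWORDS):
        return "F16"   # upcast important layers to higher precision
    return default     # quantize the bulk of the transformer harder

plan = {name: pick_quant(name) for name in [
    "img_in.weight",                          # kept at F16
    "double_blocks.0.img_attn.qkv.weight",    # quantized to Q5_0
    "final_layer.linear.weight",              # kept at F16
]}
print(plan)
```

The point of a dynamic scheme like this is that a handful of full-precision layers costs little file size but protects the layers whose quantization error propagates most.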


    Comments (13)

    AltairTheArc
    Mar 18, 2026
    CivitAI

    Do you know why an error like FluxParams.__init__() missing 1 required positional argument: 'vec_in_dim' is appearing? :/ It happens with both safetensor and GGUF models when it comes to Flux.2, and I never found an answer to the problem. If anyone knows anything, please let me know.
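That error shape is just Python complaining that a constructor was called without an argument it requires, which typically points to a version mismatch between the loader code building the params object and the model it is fed. A minimal sketch with a hypothetical dataclass (not ComfyUI's actual FluxParams definition, which has many more fields):

```python
from dataclasses import dataclass

# Hypothetical stand-in for a model-params dataclass; the real FluxParams
# in ComfyUI is larger. The point is only the error mechanism.
@dataclass
class FluxParams:
    in_channels: int
    vec_in_dim: int  # a field the (older) calling code doesn't know to pass

# A config dict as an out-of-date loader might build it, missing vec_in_dim:
config = {"in_channels": 64}

try:
    FluxParams(**config)
except TypeError as e:
    print(e)  # FluxParams.__init__() missing 1 required positional argument: 'vec_in_dim'
```

Which is why the usual first step, as below, is to update ComfyUI and any custom nodes so both sides agree on the constructor signature.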

    ccollins
    Author
    Mar 18, 2026

    Are you using ComfyUI?

    AltairTheArc
    Mar 18, 2026

    @ccollins Yes

    ccollins
    Author
    Mar 18, 2026

    @AltairTheArc Make sure you're running the most recent version, and that any custom nodes you have are also on their most recent versions.

    AltairTheArc
    Mar 18, 2026

    @ccollins mm... I'm using 0.4.76, which is from Sep 30, 2025.

    ccollins
    Author
    Mar 18, 2026 · 1 reaction

    @AltairTheArc That is out of date, which is probably the issue. The current desktop install version is 0.8.23.

    ccollins
    Author
    Mar 21, 2026 · 1 reaction

    @AltairTheArc Did you have any luck with updating?

    AltairTheArc
    Mar 21, 2026

    @ccollins Nope, couldn't update; I got errors, probably driver-related if GitHub is to be trusted, and since my GPU is vintage and I can't upgrade the driver, well, it can't be helped. o7 Thank you for your help.

    ccollins
    Author
    Apr 2, 2026

    Copy the workflow off the following image and paste it into ComfyUI:
    https://civitai.com/images/121710803
    Just click the "ComfyUI: 5 nodes" entry on CivitAI, then paste it into ComfyUI.

    AltairTheArc
    Apr 3, 2026 · 1 reaction

    @ccollins I actually did that on my own, before you suggested it: I copied your workflow and it did not work, so I dissected it, and it still did not work, so I grabbed the template workflow from ComfyUI for the appropriate model, and that did not work either. PS: I also installed the required nodes.
    So far, Flux.2 is the only model I've touched that I can't get to work.
    I succeeded with SD 1.0, SD 1.5, SDXL, Pony, NoobAI, Illustrious, WAN 2.1, WAN 2.2, Flux.1, Chroma, Anima, sound models, voice models, text models, animation models, both safetensor and GGUF. What I'm trying to say is that I know what I'm doing (I do not, who am I kidding), but I still cannot get Flux.2 to work, be it safetensor or GGUF, and I have the newest 18.2 ComfyUI portable at the moment. So I truly have no idea. I collect all these AIs for knowledge and experience; I don't have much use for them, and Flux.2 is likely too heavy to get any value out of, but it rubs me the wrong way that I can't get it to work. I will probably make attempt number three someday.
    Thank you for your help <3

    AltairTheArc
    Apr 1, 2026

    Got the portable version working, so I could load the model; however, I now get this error: RuntimeError: mat1 and mat2 shapes cannot be multiplied (512x1024 and 7680x3072). I was using Qwen3-0.6b.safetensors loaded into the normal CLIP loader set to Flux2.
    Do I have to use a .gguf CLIP? Not to rant, but honestly, these Flux models are more trouble than they are probably worth. Ehhh. Anima dropped and I got it working in 3 hours; Flux.2 has been out for months, and I still have zero successfully working Flux.2 setups.
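For context on that error: torch.nn.functional.linear computes input @ weight.T, so the (512x1024 and 7680x3072) message means a (512, 1024) conditioning tensor hit a layer whose transposed weight is (7680, 3072), i.e. a layer expecting 7680-wide inputs. That suggests the loaded text encoder produces 1024-wide embeddings while the model's txt_in projection expects 7680-wide ones, a text-encoder mismatch that switching file formats (safetensors vs GGUF) would not fix. A minimal sketch of the dimension check, with shapes taken from the error message (the Linear(7680, 3072) txt_in layer is inferred, not confirmed):

```python
# Matmul rule: inner dimensions must agree, (m, k) @ (k, n) -> (m, n).
def matmul_shape(a, b):
    """Return the result shape of a @ b, raising a torch-style error on mismatch."""
    (m, k1), (k2, n) = a, b
    if k1 != k2:
        raise RuntimeError(
            f"mat1 and mat2 shapes cannot be multiplied ({m}x{k1} and {k2}x{n})"
        )
    return (m, n)

cond = (512, 1024)       # 512 tokens of 1024-wide embeddings (the loaded encoder's width)
txt_in_T = (7680, 3072)  # transposed weight of an assumed Linear(7680, 3072) txt_in layer

try:
    matmul_shape(cond, txt_in_T)
except RuntimeError as e:
    print(e)  # mat1 and mat2 shapes cannot be multiplied (512x1024 and 7680x3072)

# A 7680-wide conditioning tensor would pass:
print(matmul_shape((512, 7680), txt_in_T))  # (512, 3072)
```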

    AltairTheArc
    Apr 1, 2026

    If it helps anyone, here is the error log:

    File "I:\zStableDifusionComfyPortable\ComfyUI\execution.py", line 525, in execute
        output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
    File "I:\zStableDifusionComfyPortable\ComfyUI\execution.py", line 334, in get_output_data
        return_values = await async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
    File "I:\zStableDifusionComfyPortable\ComfyUI\execution.py", line 308, in async_map_node_over_list
        await process_inputs(input_dict, i)
    File "I:\zStableDifusionComfyPortable\ComfyUI\execution.py", line 296, in process_inputs
        result = f(**inputs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy_api\internal\__init__.py", line 149, in wrapped_func
        return method(locked_class, **inputs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy_api\latest\_io.py", line 1764, in EXECUTE_NORMALIZED
        to_return = cls.execute(*args, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy_extras\nodes_custom_sampler.py", line 963, in execute
        samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=noise.seed)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\samplers.py", line 1052, in sample
        output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed, latent_shapes=latent_shapes)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\patcher_extension.py", line 112, in execute
        return self.original(*args, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\samplers.py", line 995, in outer_sample
        output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed, latent_shapes=latent_shapes)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\samplers.py", line 981, in inner_sample
        samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\patcher_extension.py", line 112, in execute
        return self.original(*args, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\samplers.py", line 751, in sample
        samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
    File "I:\zStableDifusionComfyPortable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 124, in decorate_context
        return func(*args, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\k_diffusion\sampling.py", line 205, in sample_euler
        denoised = model(x, sigma_hat * s_in, **extra_args)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\samplers.py", line 400, in __call__
        out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\samplers.py", line 954, in __call__
        return self.outer_predict_noise(*args, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\samplers.py", line 961, in outer_predict_noise
        ).execute(x, timestep, model_options, seed)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\patcher_extension.py", line 112, in execute
        return self.original(*args, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\samplers.py", line 964, in predict_noise
        return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\samplers.py", line 380, in sampling_function
        out = calc_cond_batch(model, conds, x, timestep, model_options)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\samplers.py", line 205, in calc_cond_batch
        return _calc_cond_batch_outer(model, conds, x_in, timestep, model_options)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\samplers.py", line 213, in _calc_cond_batch_outer
        return executor.execute(model, conds, x_in, timestep, model_options)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\patcher_extension.py", line 112, in execute
        return self.original(*args, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\samplers.py", line 325, in _calc_cond_batch
        output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\model_base.py", line 171, in apply_model
        return comfy.patcher_extension.WrapperExecutor.new_class_executor(
            ...<2 lines>...
            comfy.patcher_extension.get_all_wrappers(comfy.patcher_extension.WrappersMP.APPLY_MODEL, transformer_options)
        ).execute(x, t, c_concat, c_crossattn, control, transformer_options, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\patcher_extension.py", line 112, in execute
        return self.original(*args, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\model_base.py", line 210, in _apply_model
        model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds)
    File "I:\zStableDifusionComfyPortable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1776, in _wrapped_call_impl
        return self._call_impl(*args, **kwargs)
    File "I:\zStableDifusionComfyPortable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1787, in _call_impl
        return forward_call(*args, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\ldm\flux\model.py", line 345, in forward
        return comfy.patcher_extension.WrapperExecutor.new_class_executor(
            ...<2 lines>...
            comfy.patcher_extension.get_all_wrappers(comfy.patcher_extension.WrappersMP.DIFFUSION_MODEL, transformer_options)
        ).execute(x, timestep, context, y, guidance, ref_latents, control, transformer_options, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\patcher_extension.py", line 112, in execute
        return self.original(*args, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\ldm\flux\model.py", line 406, in forward
        out = self.forward_orig(img, img_ids, context, txt_ids, timestep, y, guidance, control, timestep_zero_index=timestep_zero_index, transformer_options=transformer_options, attn_mask=kwargs.get("attention_mask", None))
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\ldm\flux\model.py", line 182, in forward_orig
        txt = self.txt_in(txt)
    File "I:\zStableDifusionComfyPortable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1776, in _wrapped_call_impl
        return self._call_impl(*args, **kwargs)
    File "I:\zStableDifusionComfyPortable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1787, in _call_impl
        return forward_call(*args, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\comfy\ops.py", line 392, in forward
        return self.forward_comfy_cast_weights(*args, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\custom_nodes\ComfyUI-GGUF-main\ops.py", line 215, in forward_comfy_cast_weights
        out = self.forward_ggml_cast_weights(input, *args, **kwargs)
    File "I:\zStableDifusionComfyPortable\ComfyUI\custom_nodes\ComfyUI-GGUF-main\ops.py", line 244, in forward_ggml_cast_weights
        return torch.nn.functional.linear(input, weight, bias)

    AltairTheArc
    Apr 1, 2026

    Nope, still the same error even with the GGUF CLIP. Oh dear loooooord, I hate Flux.

    Edit: Qwen3-0.6b.gguf

    Checkpoint
    Flux.2 Klein 4B

    Details

    Downloads
    18
    Platform
    CivitAI
    Platform Status
    Available
    Created
    2/19/2026
    Updated
    5/13/2026
    Deleted
    -

    Files

    unslothFLUX2Klein4BGGUF_q50.gguf