Original workflow found on Reddit, with some minor changes.
You can outpaint each side of the image independently.
Uses the promax version of controlnet++: https://huggingface.co/xinsir/controlnet-union-sdxl-1.0/tree/main
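Conceptually, outpainting one side means padding the image with blank pixels on that side and building a mask that covers only the new region, which the sampler then fills. A minimal sketch of that idea, assuming NumPy arrays (the function name and layout are illustrative, not nodes from the workflow):

```python
import numpy as np

def pad_for_outpaint(image, pad, side):
    """Pad `image` (H, W, C) with `pad` blank pixels on one side.

    Returns (padded_image, mask), where mask is 1.0 over the new
    pixels to be generated and 0.0 over the original image.
    """
    h, w, c = image.shape
    if side in ("left", "right"):
        padded = np.zeros((h, w + pad, c), dtype=image.dtype)
        mask = np.ones((h, w + pad), dtype=np.float32)
        if side == "left":
            padded[:, pad:] = image   # original shifted right
            mask[:, pad:] = 0.0
        else:
            padded[:, :w] = image     # new pixels on the right
            mask[:, :w] = 0.0
    elif side in ("top", "bottom"):
        padded = np.zeros((h + pad, w, c), dtype=image.dtype)
        mask = np.ones((h + pad, w), dtype=np.float32)
        if side == "top":
            padded[pad:] = image      # original shifted down
            mask[pad:] = 0.0
        else:
            padded[:h] = image        # new pixels at the bottom
            mask[:h] = 0.0
    else:
        raise ValueError(f"unknown side: {side}")
    return padded, mask
```

Doing one side at a time, as the workflow allows, keeps each generation focused on a single seam.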
Comments (6)
man, where can I find 'SetUnionControlNetType' node?
Try updating ComfyUI to the latest version; it's a native node.
The first, the one and only that works on the first try! thanx!!!
Without VAE Encode (for Inpainting) it works much, much better.
I don't know why.
Thanks for posting this! It definitely works.
I did have a tricky scene with a pool, vegetation, chairs, tiles, etc. and modified my copy of the workflow to eliminate the horror show:
Improvements:
1) Limit the end_percent of the Apply ControlNet node; sometimes setting it to 0.3 is enough to guide the image and then let the sampler work unguided. This one change alone is huge.
2) Chain a second Apply ControlNet node: set the first one to a crazy strength (like 5.00) but apply it only for the first 10% (0.0 to 0.1) of the run. Give the second one a normal strength of 1.00 but apply it from 0.1 to 0.6 of the run.
3) The VAE Encode (for Inpainting) node needs grow_mask_by set to something more than 0, or obvious seams occurred. A value of 32 or even 64 gave much better results than 16 or less.
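The two-node chaining in (2) amounts to a piecewise guidance-strength schedule over the denoising run: very strong guidance briefly, then normal guidance, then none. A small sketch of that schedule (the fractions 0.1/0.6 and strengths 5.00/1.00 come from the settings above; the function itself is illustrative, not ComfyUI API):

```python
def controlnet_strength(progress):
    """Combined ControlNet strength at a point in the run (0.0-1.0),
    mirroring the two chained Apply ControlNet nodes described above:
      node 1: strength 5.0, start 0.0, end 0.1
      node 2: strength 1.0, start 0.1, end 0.6
    """
    strength = 0.0
    if 0.0 <= progress < 0.1:   # first node: crazy strength, brief window
        strength += 5.0
    if 0.1 <= progress < 0.6:   # second node: normal strength
        strength += 1.0
    return strength             # 0.0 after 0.6: sampler finishes unguided
```

The design intuition is that the strong early burst locks in composition while the latents are still noisy, and releasing guidance after 0.6 lets the sampler resolve fine detail freely, the same reasoning as the end_percent trick in (1).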
Things that didn't work for me:
1) Feathering: no combination of image/mask feathering improved things. The note in the workflow was 100 percent accurate.
2) VAE Encode nodes that don't take in the mask parameter (AKA: can't use Tiled VAE Encode)
3) Using a checkpoint designed for inpainting: the CyberRealisticPony_v125 checkpoint I normally use gave much better results.
Other thoughts: I got great results expanding an image made with Flux, so mixing models definitely works.
Very nice setup. Working very well from the first run. Thank you!