Uses Impact Wildcard Prompt to build prompts for an automated "photoshoot" via a local or private OpenAI-compatible endpoint. If you don't want the LLM step and just want to use WAN2.2, Pusa, MPS, and HPS2.1 together, bypass it and use static prompts or Impact Wildcard Prompt on its own.
Description
Changes for this version:
Cleaned up required custom nodes
Fixed wrong defaults
Improved the LLM System prompt
Simplified workflow design
Swapped the previous LLM nodes for Tara, for stability and to cut down on timeouts.
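The LLM step above talks to any OpenAI-compatible chat endpoint, so it works the same whether the server is Tara's backend, Ollama, or LM Studio. A minimal sketch of that request, using only the standard library — the base URL, model name, and system prompt here are placeholders, not values from the workflow:

```python
import json
from urllib import request

# Hypothetical values; point these at your own local or private server.
BASE_URL = "http://localhost:11434/v1"   # e.g. a local Ollama instance
MODEL = "llama3"

def build_payload(subject: str, system_prompt: str) -> dict:
    """Build an OpenAI-style chat request that expands a wildcard
    subject into a single detailed photoshoot prompt."""
    return {
        "model": MODEL,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user",
             "content": f"Write one cinematic photoshoot prompt for: {subject}"},
        ],
        "temperature": 0.8,
    }

def query(payload: dict) -> str:
    """POST the payload to the /chat/completions route and return the text."""
    req = request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

if __name__ == "__main__":
    payload = build_payload("a lighthouse at dusk",
                            "You are a photography prompt writer.")
    print(query(payload))  # requires a running local server
```

If you bypass the LLM node in the workflow, this step simply never runs and the static or wildcard prompt is passed downstream unchanged.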
Comments (2)
Thanks for sharing. I ended up replacing Tara with Ollama nodes, which are more widely used and also local. The only minor issue is that videos and characters tend to have this plastic, shiny look to them, and I've also noticed a lot of mutations with the original settings.
Awesome, thanks for the feedback. I've noticed the shine and mutations as well, and I'm working on isolating both in a newer version. Some of the mutations come from the prompt and some from HPS and MPS. I'm currently running a second pass of the LLM for prompt cohesion and it has helped a lot. The shine issue I'm still working on; it's a combination of the scheduler and the LoRAs.