4GB PONY (Full Checkpoint)
Custom CLIP is not quantized
Custom UNET quantized to FP8, striking a balance between size and quality
Works in Forge, ComfyUI, and Automatic1111
Works with LoRAs
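To show what FP8 quantization of the UNET weights actually does, here is a minimal pure-Python sketch of rounding a value to the e4m3fn format (1 sign bit, 4 exponent bits, 3 mantissa bits, bias 7, max finite value 448) — the FP8 variant commonly used for UNET weights. This is an illustration of the rounding step only; real checkpoints quantize whole tensors with a tensor library, and the function name here is mine, not from any toolchain.

```python
import math

def quantize_e4m3(x: float) -> float:
    """Round a float to the nearest e4m3fn value (1 sign, 4 exponent,
    3 mantissa bits, exponent bias 7). Pure-Python illustration of the
    per-value rounding; real FP8 quantization operates on tensors."""
    if x == 0.0:
        return 0.0
    sign = -1.0 if x < 0 else 1.0
    a = abs(x)
    # e4m3fn has no infinities: magnitudes saturate at the max finite 448.
    if a >= 448.0:
        return sign * 448.0
    m, e = math.frexp(a)          # a = m * 2**e, with m in [0.5, 1)
    exp = e - 1                   # a = (2*m) * 2**exp, with 2*m in [1, 2)
    if exp < -6:                  # subnormal range: fixed step of 2**-9
        step = 2.0 ** -9
        return sign * round(a / step) * step
    # Normal range: keep 3 fractional mantissa bits (steps of 1/8).
    frac = round(m * 2 * 8) / 8
    return sign * frac * (2.0 ** exp)

quantize_e4m3(0.3)   # rounds to 0.3125, about 4% error
```

The few-percent rounding error on small weights is exactly why the CLIP here is left unquantized while the larger UNET takes the FP8 hit.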
GGUF 4STEP (example images at 6 steps)
Note: this was up-cast to FP32 before quantizing; to my knowledge, no FP32 version of Pony exists
For 4-step use, set CFG to 1.0-2.0
Some modifications were made to the UNET
Requires Pony VAE and CLIP-G, CLIP-L
Requires GGUF support
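The CFG 1.0-2.0 recommendation follows from the standard classifier-free guidance formula: at a scale of 1.0 the unconditional term drops out entirely, which is why few-step models tolerate such low values. A minimal sketch of the formula (the function name and plain-list tensors are illustrative, not from any specific UI):

```python
def apply_cfg(cond: list, uncond: list, scale: float) -> list:
    """Classifier-free guidance: uncond + scale * (cond - uncond).
    At scale 1.0 this returns the conditional prediction unchanged."""
    return [u + scale * (c - u) for c, u in zip(cond, uncond)]

cond = [1.0, -0.5]
uncond = [0.5, 0.0]
apply_cfg(cond, uncond, 1.0)  # -> [1.0, -0.5], cond passes through
apply_cfg(cond, uncond, 2.0)  # -> [1.5, -1.0], difference amplified
```

Higher scales amplify the gap between conditional and unconditional predictions, which is what burns out images on distilled few-step models.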
GGUF (FP16/Q8)
Requires Pony VAE and CLIP-G, CLIP-L
Requires GGUF support
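"Requires GGUF support" means the loader must understand the GGUF container format. As an illustration of the first thing such a loader checks, here is a sketch of parsing the fixed GGUF header (magic, version, tensor count, metadata count), based on the published GGUF layout; a real loader goes on to read the metadata key/value pairs and tensor info that follow.

```python
import struct

GGUF_MAGIC = b"GGUF"

def read_gguf_header(blob: bytes) -> dict:
    """Parse the fixed-size GGUF header: 4-byte magic, uint32 version,
    uint64 tensor count, uint64 metadata key/value count, little-endian."""
    magic, version, n_tensors, n_kv = struct.unpack_from("<4sIQQ", blob, 0)
    if magic != GGUF_MAGIC:
        raise ValueError("not a GGUF file")
    return {"version": version, "tensors": n_tensors, "metadata_kv": n_kv}

# Build a synthetic header for illustration (version 3, 2 tensors, 5 kv pairs).
blob = struct.pack("<4sIQQ", GGUF_MAGIC, 3, 2, 5)
header = read_gguf_header(blob)
```

A loader that fails this magic check is the usual reason GGUF checkpoints refuse to load in UIs without GGUF support installed.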
Comments (9)
Can you do the same for Illustrious, including a quantized version?
Possibly; I haven't looked much into what was changed for Illustrious
If only there were a GGUF version that supported LoRAs
In ComfyUI there are no problems; LoRAs work, although you have to test which ones come through strongly and which don't.
Can someone share their workflow? For some reason my sampler is not working with GGUF files
The error I'm getting is: 'dict' object has no attribute 'get_model_object'
Are you using the correct CLIP for Pony? I would recommend the FP32 CLIP or one of the Universal CLIPs that I have available
@Felldude I switched to the full checkpoint and it's giving very good images
What are the configurations?