Goddess Project
--Formerly Uncensored Females--
Standalone Checkpoint - Goddess works in FORGE only
DO NOT LOAD a separate VAE, TE, or CLIP unless using GGUF
This version is a mixed-precision Flux Dev model, with limited UNET changes to allow for feminine anatomy.
Run this model in automatic FP16 LoRA mode, NOT NF4.
The UNET is full-precision BF16, with mixed-precision (NF4) TE blocks.
The model fits on a 24 GB card and can therefore be run in GPU-only mode.
High-speed BF16, with slightly lower prompt accuracy than the 33 GB full model.
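For anyone scripting this outside Forge, a minimal sketch of loading a single-file Flux checkpoint in BF16 with diffusers, assuming a recent diffusers release with Flux single-file support (fetching the base config may require Hugging Face access to FLUX.1-dev). The filename and prompt are hypothetical, not the actual release files:

```python
import torch
from diffusers import FluxPipeline

# Hypothetical local filename for this checkpoint. No separate
# VAE/TE/CLIP is loaded, since the safetensors file ships its own.
pipe = FluxPipeline.from_single_file(
    "goddess_bf16.safetensors",
    torch_dtype=torch.bfloat16,
)
pipe.to("cuda")  # GPU-only mode; the BF16 build fits on a 24 GB card

image = pipe(
    "portrait photo, studio lighting",
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
image.save("out.png")
```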
Links (For GGUF ONLY)
Updated CLIP - Standard CLIP-L - FP8 CLIP-L -- ** Version Comparison **
Per the Apache 2.0 license, FLAN is attributed to Google.
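If you do go the GGUF route (the only case where the separate CLIP/TE links above apply), a sketch of how that looks in diffusers using its documented GGUF support. The GGUF filename and the CLIP repo are placeholders, not the actual Goddess files; substitute the updated CLIP-L linked above:

```python
import torch
from transformers import CLIPTextModel
from diffusers import FluxPipeline, FluxTransformer2DModel, GGUFQuantizationConfig

# Placeholder GGUF path -- substitute the actual Goddess quant.
transformer = FluxTransformer2DModel.from_single_file(
    "goddess-Q8_0.gguf",
    quantization_config=GGUFQuantizationConfig(compute_dtype=torch.bfloat16),
    torch_dtype=torch.bfloat16,
)
# Stand-in for the updated CLIP-L; GGUF builds need a separate TE.
text_encoder = CLIPTextModel.from_pretrained(
    "openai/clip-vit-large-patch14",
    torch_dtype=torch.bfloat16,
)
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    transformer=transformer,
    text_encoder=text_encoder,
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()  # helps when the quant + TE exceed VRAM
```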
This model was trained on many individuals with known ages and 2257 forms, and it has also been merged to try to ensure that no known individual can be reproduced. However, FLUX seems to learn faces even from less than 10% of the data rather than merging them into a new face.
Comments
Hey, sorry to bother you again. I just ran the workflow you put out, but this is the result I got when I ran it: https://ibb.co/GV0QnM0
If you're not getting errors in Python, I'm not sure what to recommend other than updating Comfy.
@Felldude I don't get any errors either when I run it. I'll try my luck with another UI. Your GGUF and NF4 models both work fine for me; I don't know why only this one decided not to :(
Speed test: OK, that's for your NVIDIA card.
Have you tested GGUF? You will see it's very slow ;)
For me it works OK by itself, but with any Flux LoRA it makes blurry images.
For LoRA support I tend to use NF4 (FP16 LoRA) in Forge.
@JohnRohan Have you tried reducing the weight of the LoRA?
@Kierkegaard420 Yes, and it's still blurry with any other LoRAs I tried. I'm using FP16 too.
It's even blurry when I use a simple pubic hair LoRA!!!
@JohnRohan I have tested with LoRAs in auto FP16 LoRA, the only mode it should be used in. With normal/Euler the majority of images were fine; however, as with most NF4 checkpoints, it can simply fail due to the insane quantization.
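For reference, reducing LoRA weight in a scripted diffusers setup looks like the sketch below; the LoRA file, adapter name, and 0.5 weight are illustrative, not a tested fix for this checkpoint. In Forge itself, the equivalent is lowering the number in the prompt tag, e.g. <lora:name:0.5> instead of <lora:name:1.0>.

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)
pipe.enable_model_cpu_offload()

# Hypothetical LoRA file; "detail" is just an adapter label.
pipe.load_lora_weights("my_flux_lora.safetensors", adapter_name="detail")
# Lower the adapter weight (e.g. 0.5 instead of 1.0) if outputs blur.
pipe.set_adapters(["detail"], adapter_weights=[0.5])

image = pipe("portrait photo", num_inference_steps=28).images[0]
image.save("lora_out.png")
```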
