CivArchive
    7PAG [bf16/fp16] [no-ema-no-vae] [SafeTensors] [Checkpoint] - 7PAG-LSCX
    NSFW


    =====================

    Disclaimer:

    I'm just a script kiddie. I have no idea what I'm doing.

    So, keep calm and know nothing.

    =====================

    All the credit goes to LotusSnacks.

    https://civarchive.com/user/LotusSnacks

    https://civarchive.com/models/17143/7pag-nsfw-merged-model

    Description

    7PAG-LSCX

    Comments (3)

    daxivi6026 · Oct 17, 2023
    CivitAI

    Which one is better for inference, bf16 or fp16? I googled it but only found that bf16 is more stable during training. Do old GPUs support bf16 checkpoints?

    LeaderThree (Author) · Oct 18, 2023 · 1 reaction

    New GPUs (30X0/40X0, Ampere/Ada Lovelace): yep, BF16 is more stable.

    Old GPUs (10X0/20X0, Pascal/Turing): just use FP16.

    For inference there is no significant difference between BF16 and FP16, so if you don't care, just use FP16.
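    The practical trade-off behind this advice is range vs. precision: FP16 (1 sign, 5 exponent, 10 mantissa bits) overflows above 65504, while BF16 (1-8-7) keeps float32's full exponent range but carries far fewer significand bits. A minimal pure-Python sketch of the difference — bf16 is simulated here by truncating a float32's low 16 bits (truncation, not IEEE round-to-nearest, for simplicity; the `to_bf16` helper is illustrative, not a library API):

    ```python
    import struct

    FP16_MAX = 65504.0  # largest finite float16 value (1-5-10 layout)

    def to_bf16(x: float) -> float:
        """Simulate bfloat16: a float32 with its low 16 mantissa bits dropped.

        Real hardware rounds to nearest; this sketch simply truncates.
        """
        bits = struct.unpack(">I", struct.pack(">f", x))[0]
        return struct.unpack(">f", struct.pack(">I", bits & 0xFFFF0000))[0]

    # Range: 100000.0 overflows FP16 (> 65504) but stays finite in BF16.
    print(to_bf16(100000.0))  # → 99840.0 (finite, just less precise)

    # Precision: BF16 keeps only ~7 mantissa bits, so small deltas vanish.
    print(to_bf16(1.003))     # → 1.0 (FP16, with 10 mantissa bits, would keep ~1.003)
    ```

    This is why the answer above holds: for inference, weights rarely need more than either format offers, so the choice mostly comes down to what the GPU supports natively.
    
    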

    daxivi6026 · Oct 18, 2023

    @LeaderThree Thanks a lot, that clears it up for me 👍

    Checkpoint
    SD 1.5

    Details

    Downloads: 689
    Platform: CivitAI
    Platform Status: Available
    Created: 10/17/2023
    Updated: 5/13/2026
    Deleted: -

    Files

    7pagBf16Fp16NoEmaNoVae_7pagLSCX.safetensors
