CivArchive
    AnimateDiff Motion Loras (zoom, pan, roll, tilt) Bundle - Zoom In

    I did not train these models; I am only reposting them here for wider availability. The original download links are provided below.

    MotionLoRA and its model zoo enable camera movement controls. Download the MotionLoRA models (74 MB each, also available on Google Drive / HuggingFace) and save them to the models/MotionLoRA folder. Example:

    python -m scripts.animate --config configs/prompts/v2/5-RealisticVision-MotionLoRA.yaml

    https://github.com/guoyww/animatediff/
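
    The expected on-disk layout implied by the instructions above can be sketched as follows (a minimal sketch; the filename is one example from this bundle, and the path is assumed to be relative to an AnimateDiff repo checkout):

    ```shell
    # Create the MotionLoRA folder inside the AnimateDiff repo checkout
    mkdir -p models/MotionLoRA

    # After downloading from the Google Drive / HuggingFace links above,
    # each checkpoint (~74 MB) should sit directly in that folder, e.g.:
    #   models/MotionLoRA/v2_lora_ZoomIn.ckpt
    ```

    With the checkpoint in place, the `python -m scripts.animate` command above picks it up via the MotionLoRA entries in the prompt config YAML.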


    Comments (1)

    MakiGirl · Sep 13, 2024

    Hi!

    I'm getting errors no matter what I try! I saw a YouTube tutorial using them and followed the instructions, but no luck.
    I've been trying to find the answer for the last 2 days with no success.
    So I'm asking here; hopefully someone can help!

    So this is the error message I'm getting:

    ...
    loading network C:\Users\XY\stable-diffusion-webui\models\Lora\v2_lora_ZoomIn.ckpt: AssertionError
    Traceback (most recent call last):
    File "C:\Users\XY\stable-diffusion-webui\extensions-builtin\Lora\networks.py", line 321, in load_networks
    net = load_network(name, network_on_disk)
    File "C:\Users\XY\stable-diffusion-webui\extensions-builtin\Lora\networks.py", line 254, in load_network
    raise AssertionError(f"Could not find a module type (out of {', '.join([x.__class__.__name__ for x in module_types])}) that would accept those keys: {', '.join(weights.w)}")
    AssertionError: Could not find a module type (out of ModuleTypeLora, ModuleTypeHada, ModuleTypeIa3, ModuleTypeLokr, ModuleTypeFull, ModuleTypeNorm, ModuleTypeGLora, ModuleTypeOFT) that would accept those keys: 0.motion_modules.0.temporal_transformer.transformer_blocks.0.attention_blocks.0.processor.to_q_lora.down.weight, 0.motion_modules.0.temporal_transformer.transformer_blocks.0.attention_blocks.0.processor.to_q_lora.up.weight, 0.motion_modules.0.temporal_transformer.transformer_blocks.0.attention_blocks.0.processor.to_k_lora.down.weight,
    ...

    Thank you!!
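
    For context on the error above: the key names in the traceback (`0.motion_modules...temporal_transformer...`) target AnimateDiff's temporal motion modules, not the UNet or text-encoder layers that a standard LoRA patches, which is why the webui's LoRA loader in `models/Lora` rejects the file. MotionLoRA checkpoints are meant to be loaded by the AnimateDiff pipeline itself (e.g. the `scripts.animate` command above). A minimal sketch of telling the two apart by key prefix — the sample keys are copied from the traceback, and `looks_like_motion_lora` is a hypothetical helper, not part of any library:

    ```python
    # Distinguish an AnimateDiff MotionLoRA from a regular SD LoRA by its key names.
    # Sample keys copied from the traceback above; a real checkpoint would be
    # loaded with torch.load(path, map_location="cpu") and its dict keys inspected.
    sample_keys = [
        "0.motion_modules.0.temporal_transformer.transformer_blocks.0"
        ".attention_blocks.0.processor.to_q_lora.down.weight",
        "0.motion_modules.0.temporal_transformer.transformer_blocks.0"
        ".attention_blocks.0.processor.to_q_lora.up.weight",
    ]

    def looks_like_motion_lora(keys):
        """Heuristic: MotionLoRA weights live under AnimateDiff's motion modules."""
        return any("motion_modules" in k and "temporal_transformer" in k for k in keys)

    print(looks_like_motion_lora(sample_keys))  # → True for a MotionLoRA checkpoint
    ```

    A regular SD LoRA's keys (e.g. `lora_unet_down_blocks_...`) would make the helper return False, matching what the webui loader expects.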