QuantStack/Wan2.2-T2V-A14B-GGUF

Tags: text-to-video, gguf, t2v · Base model: Wan-AI/Wan2.2-T2V-A14B (quantized) · License: apache-2.0

This GGUF file is a direct conversion of Wan-AI/Wan2.2-T2V-A14B.

Since this is a quantized model, all original licensing terms and usage restrictions remain in effect.

Usage

The model can be used with the ComfyUI custom node ComfyUI-GGUF by city96.

Place the model files in ComfyUI/models/unet; see the GitHub README for further installation instructions.
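
Below is a minimal sketch of fetching a quant file from this repository into the ComfyUI unet folder using huggingface_hub. The filename shown is a placeholder, not necessarily a file that exists in this repo; check the repository's file list and pick the quantization level (and high/low-noise variant) you want, and adjust the ComfyUI path to your install.

```python
# Sketch: download one GGUF quant into ComfyUI/models/unet.
# Assumes `pip install huggingface_hub`; the filename below is a placeholder.
from pathlib import Path
from huggingface_hub import hf_hub_download

comfyui_unet_dir = Path("ComfyUI/models/unet")  # adjust to your ComfyUI install
comfyui_unet_dir.mkdir(parents=True, exist_ok=True)

downloaded_path = hf_hub_download(
    repo_id="QuantStack/Wan2.2-T2V-A14B-GGUF",
    filename="Wan2.2-T2V-A14B-Q4_K_M.gguf",  # placeholder — pick an actual file from the repo
    local_dir=comfyui_unet_dir,
)
print(f"Model saved to {downloaded_path}")
```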
