From da8e1fe7e4d87cbe203bc67b6b960857abe110d4 Mon Sep 17 00:00:00 2001
From: Artiprocher
Date: Mon, 3 Mar 2025 14:19:16 +0800
Subject: [PATCH] support sage attention

---
 examples/wanvideo/README.md | 7 +++++++
 1 file changed, 7 insertions(+)

diff --git a/examples/wanvideo/README.md b/examples/wanvideo/README.md
index ecfc536..4972c26 100644
--- a/examples/wanvideo/README.md
+++ b/examples/wanvideo/README.md
@@ -10,6 +10,13 @@
 cd DiffSynth-Studio
 pip install -e .
 ```
+Wan-Video supports multiple attention implementations. If more than one of the following is installed, the highest-priority one is used:
+
+* [Flash Attention 3](https://github.com/Dao-AILab/flash-attention)
+* [Flash Attention 2](https://github.com/Dao-AILab/flash-attention)
+* [Sage Attention](https://github.com/thu-ml/SageAttention)
+* [torch SDPA](https://pytorch.org/docs/stable/generated/torch.nn.functional.scaled_dot_product_attention.html) (default; `torch>=2.5.0` is recommended)
+
 ## Inference
 
 ### Wan-Video-1.3B-T2V
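The priority-based backend selection described in the added README text can be sketched as follows. This is a minimal illustration only, not DiffSynth-Studio's actual code: the backend names and the `detect_attention_backend` helper are hypothetical, and the probed module names (`flash_attn_interface`, `flash_attn`, `sageattention`) are assumptions about how each package is imported.

```python
import importlib.util

def detect_attention_backend() -> str:
    """Return the highest-priority attention backend that appears installed.

    Hypothetical sketch mirroring the priority order in the README; the
    module names checked here are assumptions, not confirmed package layouts.
    """
    priority = [
        ("flash_attn_3", "flash_attn_interface"),  # Flash Attention 3
        ("flash_attn_2", "flash_attn"),            # Flash Attention 2
        ("sage_attn", "sageattention"),            # Sage Attention
    ]
    for name, module in priority:
        # find_spec returns None when the module is not importable
        if importlib.util.find_spec(module) is not None:
            return name
    # Fall back to torch.nn.functional.scaled_dot_product_attention
    return "torch_sdpa"

print(detect_attention_backend())
```

Checking `find_spec` rather than importing avoids paying the import cost of heavy CUDA extensions just to decide which backend to use.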