Files
DiffSynth-Studio/diffsynth/diffusion
Mr-Neutr0n b68663426f fix: preserve sign of denominator in clamp to avoid inverting gradient direction
The previous .clamp(min=1e-6) on (sigma_ - sigma) flips the sign when
the denominator is negative (which is the typical case since sigmas
decrease monotonically). This would invert the target and cause
training divergence.

Use torch.sign(denom) * torch.clamp(denom.abs(), min=1e-6) instead,
which prevents division by zero while preserving the correct sign.
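The fix described above can be sketched in plain Python (the torch expression `torch.sign(denom) * torch.clamp(denom.abs(), min=1e-6)` is the elementwise analogue; note that for an exactly-zero input `torch.sign` returns 0, so the sketch below adopts a treat-zero-as-positive convention as an assumption, not something stated in the commit):

```python
def safe_denom(denom: float, eps: float = 1e-6) -> float:
    """Clamp |denom| away from zero while keeping its sign.

    A plain clamp(min=eps) would map any negative denom to eps,
    flipping the sign of the subsequent division -- the bug this
    commit fixes.
    """
    if denom == 0.0:
        return eps  # assumed convention: treat exact zero as positive
    sign = 1.0 if denom > 0.0 else -1.0
    return sign * max(abs(denom), eps)

# Typical case: sigmas decrease monotonically, so sigma_ - sigma < 0.
print(safe_denom(-0.25))   # -0.25  (sign preserved, magnitude untouched)
print(safe_denom(-1e-9))   # -1e-06 (magnitude clamped, sign kept)
print(safe_denom(0.5))     # 0.5
```

With the old `clamp(min=1e-6)`, the first two calls would both have returned `1e-06`, inverting the training target whenever the true denominator was negative.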
2026-02-11 21:04:55 +05:30