Pull requests: huggingface/diffusers
feat(utils): optional aiter swizzle GEMM for ROCm MI + aiter (#13327), opened Mar 25, 2026 by LiuYinfeng01
Add Flux2 LoKR adapter support prototype with dual conversion paths (#13326), opened Mar 25, 2026 by CalamitousFelicitousness (5 of 6 tasks)
[flux.2 LoRA] make lora training compatible with flux.2 klein kv (#13325), opened Mar 24, 2026 by linoytsaban (Draft)
feat: Add Modular Pipeline for Stable Diffusion 3 (SD3) (#13324), opened Mar 24, 2026 by AlanPonnachan (4 of 6 tasks)
Add warning when using CUDA random generator (#13298) (#13309), opened Mar 22, 2026 by r266-tech (2 tasks done)
fix: disable non-blocking tensor copies to MPS during model loading (#13308), opened Mar 22, 2026 by agarwalprakhar2511
fix(dreambooth): batch size mismatch with --with_prior_preservation in flux2 scripts (#13307), opened Mar 22, 2026 by agarwalprakhar2511
fix: skip pin_memory and non_blocking transfer for tensor subclasses in group offloading (#13305), opened Mar 22, 2026 by s-zx
fix: support torchao >= 0.16.0 by importing renamed CamelCase Config classes (#13304), opened Mar 22, 2026 by s-zx
fix: cast input to weight dtype in PatchEmbed and FluxTransformer2DModel to prevent dtype mismatch (#13303), opened Mar 21, 2026 by s-zx
fix(attention): download hub kernel in AttentionModuleMixin.set_attention_backend (#13302), opened Mar 21, 2026 by s-zx
Fix missing latents_bn_std dtype cast in VAE normalization (#13299), opened Mar 21, 2026 by adi776borate (3 of 6 tasks)
remove str option for quantization config in torchao (#13291), opened Mar 19, 2026 by howardzhang-cv
[Neuron] Add AWS Neuron (Trainium/Inferentia) as an officially supported device (#13289), opened Mar 19, 2026 by JingyaHuang (Draft)
Add low precision attention API from torchao to TorchAoConfig (#13285), opened Mar 19, 2026 by howardzhang-cv (Draft, 2 tasks)
Update attention_backends.md to update FA3 minimum support to Ampere (#13283), opened Mar 18, 2026 by sayakpaul
ProTip! Find all pull requests that aren't related to any open issues with the -linked:issue search qualifier.
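The same qualifier works programmatically against GitHub's search API. A minimal sketch, assuming only that the documented `search/issues` endpoint and qualifier syntax apply; the `search_url` helper name is hypothetical:

```python
import urllib.parse

def search_url(repo: str, extra_qualifiers: str) -> str:
    """Build a GitHub search-API URL for open PRs in `repo`,
    appending extra search qualifiers such as "-linked:issue"."""
    query = f"repo:{repo} is:pr is:open {extra_qualifiers}"
    return "https://api.github.com/search/issues?q=" + urllib.parse.quote(query)

# URL for open diffusers PRs not linked to any issue; fetching it with any
# HTTP client returns JSON with `total_count` and an `items` list.
print(search_url("huggingface/diffusers", "-linked:issue"))
```

Fetching the resulting URL (no authentication required for public repos, though unauthenticated requests are rate-limited) returns the same set of PRs the web filter shows.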