Ryan Dick
7e894ffe83
Consolidate InpaintExtension implementations for SD3 and FLUX.
2025-04-10 10:50:13 +10:00
psychedelicious
a44bfb4658
fix(mm): handle FLUX models w/ diff in_channels keys
Before FLUX Fill was merged, we didn't do any checks for the model variant. We always returned "normal".
To determine if a model is a FLUX Fill model, we need to check the state dict for a specific key. Initially, this logic was too strict and rejected quantized FLUX models. This issue was resolved, but it turns out there is another failure mode: some fine-tunes use a different key.
This change further reduces the strictness, handling the alternate key and also falling back to "normal" if we don't see either key. This effectively restores the previous probing behaviour for all FLUX models.
Closes #7856
Closes #7859
2025-03-31 12:32:55 +11:00
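The probing described in this commit reduces to a shape check on the transformer's input projection, with a fallback when neither key is present. A minimal sketch in Python, assuming hypothetical key names ("img_in.weight" and a "model.diffusion_model."-prefixed variant) and a 64-channel baseline; the repository's actual keys and thresholds may differ:

```python
import torch

def get_flux_variant(state_dict: dict[str, torch.Tensor]) -> str:
    # Check both key spellings seen in the wild (assumed names for this sketch).
    for key in ("img_in.weight", "model.diffusion_model.img_in.weight"):
        weight = state_dict.get(key)
        if weight is not None:
            # FLUX Fill concatenates mask/conditioning channels onto the image
            # input, so its input projection is wider than the usual 64 channels.
            in_channels = weight.shape[-1]
            return "inpaint" if in_channels > 64 else "normal"
    # Neither key present (e.g. an unusual fine-tune): fall back to "normal",
    # matching the pre-FLUX-Fill probing behaviour.
    return "normal"
```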
Ryan Dick
5ea3ec5cc8
Get FLUX Fill working. Note: To use FLUX Fill, set guidance to ~30.
2025-03-19 14:45:18 +11:00
Ryan Dick
4c02ba908a
Add support for FLUX Redux masks.
2025-03-06 10:31:17 +11:00
Ryan Dick
f1fde792ee
Get FLUX Redux working: model loading and inference.
2025-03-06 10:31:17 +11:00
Ryan Dick
d5211a8088
Add FluxRedux model type and probing logic.
2025-03-06 10:31:17 +11:00
Ryan Dick
b301785dc8
Normalize the T5 model identifiers so that a FLUX T5 or an SD3 T5 model can be used interchangeably.
2025-01-16 08:33:58 +11:00
Ryan Dick
607d19f4dd
We should not trust the value of model.device since the model could be partially loaded.
2025-01-07 19:22:31 -05:00
Ryan Dick
bcd29c5d74
Remove all cases where we check the 'model.device'. This is no longer trustworthy now that partial loading is permitted.
2025-01-07 00:31:00 +00:00
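These two commits address the same pitfall: once partial loading is allowed, a model's weights can be split across devices, so any single cached "model device" is unreliable. A minimal illustrative sketch (not the repository's code) of querying devices per layer instead:

```python
import torch

def devices_in_model(model: torch.nn.Module) -> set[torch.device]:
    # With partial loading, this set can contain both CPU and GPU devices.
    return {p.device for p in model.parameters()}

def run_layer(layer: torch.nn.Module, x: torch.Tensor) -> torch.Tensor:
    # Rather than trusting a model-level device attribute, move the input to
    # the device of the weights that will actually be used.
    target = next(layer.parameters()).device
    return layer(x.to(target))
```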
Ryan Dick
d84adfd39f
Clean up FLUX control LoRA pre-processing logic.
2024-12-17 07:28:45 -05:00
Brandon Rising
70811d0bd0
Remove unexpected artifacts in output images
2024-12-17 07:28:45 -05:00
Brandon Rising
046d19446c
Rename Structural Lora to Control Lora
2024-12-17 07:28:45 -05:00
Brandon Rising
f53da60b84
Lots of updates centered on using the LoRA patcher rather than changing the modules in the transformer model
2024-12-17 07:28:45 -05:00
Brandon Rising
5a035dd19f
Support bnb-quantized NF4 FLUX models, use ControlNet VAE, only support 1 structural LoRA per transformer. Various other refactors and bugfixes
2024-12-17 07:28:45 -05:00
Brandon Rising
f3b253987f
Initial setup for flux tools control loras
2024-12-17 07:28:45 -05:00
Ryan Dick
0bff4ace1b
Revert performance improvement because it caused FLUX inference to fail on Mac: https://github.com/invoke-ai/InvokeAI/issues/7422
2024-12-03 15:18:58 +00:00
Ryan Dick
021552fd81
Avoid unnecessary dtype conversions with rope encodings.
2024-11-29 12:32:50 -05:00
Ryan Dick
be73dbba92
Use view() instead of rearrange() for better performance.
2024-11-29 12:32:50 -05:00
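For reshapes that only reinterpret strides, view() avoids einops' Python-level pattern parsing on the denoising hot path. A small sketch of the kind of equivalence this commit targets; the specific pattern and shapes are assumptions, not the exact code in the repository:

```python
import torch
from einops import rearrange

# Toy query tensor: (batch, seq, heads, head_dim). Shapes are illustrative.
xq = torch.randn(1, 16, 4, 64)

# rearrange() version: split the last dim into complex-pair form for RoPE.
a = rearrange(xq, "b n h (d r) -> b n h d r", r=2)

# Equivalent view(): a pure metadata change with no pattern parsing, which is
# where the small speedup comes from (requires a contiguous tensor).
b = xq.view(*xq.shape[:-1], -1, 2)

assert torch.equal(a, b)
```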
Ryan Dick
db9c0cad7c
Replace custom RMSNorm implementation with torch.nn.functional.rms_norm(...) for improved speed.
2024-11-29 12:32:50 -05:00
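A sketch of the kind of swap described: replacing a hand-rolled RMSNorm forward with the fused torch.nn.functional.rms_norm (available in PyTorch 2.4+). The module below is illustrative, not the repository's implementation:

```python
import torch
import torch.nn.functional as F

class RMSNorm(torch.nn.Module):
    def __init__(self, dim: int, eps: float = 1e-6):
        super().__init__()
        self.eps = eps
        self.weight = torch.nn.Parameter(torch.ones(dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Hand-rolled version, for comparison:
        #   normed = x * torch.rsqrt(x.pow(2).mean(-1, keepdim=True) + self.eps)
        #   return normed * self.weight
        # Fused version (PyTorch >= 2.4), typically faster on GPU:
        return F.rms_norm(x, (x.shape[-1],), weight=self.weight, eps=self.eps)
```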
Ryan Dick
5d8dd6e26e
Fix FLUX regional negative prompts.
2024-11-28 18:49:29 +00:00
Ryan Dick
64364e7911
Short-circuit if there are no region masks in FLUX and don't apply attention masking.
2024-11-27 22:40:10 +00:00
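A minimal sketch of the short-circuit, assuming a hypothetical attention helper that receives an optional regional mask; when no regions are defined, the unmasked SDPA fast path is used:

```python
import torch
import torch.nn.functional as F

def attention(
    q: torch.Tensor,
    k: torch.Tensor,
    v: torch.Tensor,
    region_mask: torch.Tensor | None = None,
) -> torch.Tensor:
    # No regional prompts: skip building/applying an attention mask entirely.
    if region_mask is None:
        return F.scaled_dot_product_attention(q, k, v)
    return F.scaled_dot_product_attention(q, k, v, attn_mask=region_mask)
```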
Ryan Dick
6565cea039
Comment unused _prepare_unrestricted_attn_mask(...) for future reference.
2024-11-27 22:16:44 +00:00
Ryan Dick
3ebd8d6c07
Delete outdated TODO comment.
2024-11-27 22:13:25 +00:00
Ryan Dick
e970185161
Tweak flux regional prompting attention scheme based on latest experimentation.
2024-11-27 22:13:07 +00:00
Ryan Dick
b54463d294
Allow regional prompting background regions to attend to themselves and to the entire txt embedding.
2024-11-26 17:57:31 +00:00
Ryan Dick
faee79dc95
Distinguish between restricted and unrestricted attn masks in FLUX regional prompting.
2024-11-26 16:55:52 +00:00
Ryan Dick
e01f66b026
Apply regional attention masks in the single stream blocks in addition to the double stream blocks.
2024-11-25 22:40:08 +00:00
Ryan Dick
53abdde242
Update Flux RegionalPromptingExtension to prepare both a mask with restricted image self-attention and a mask with unrestricted image self-attention.
2024-11-25 22:04:23 +00:00
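A rough sketch of how the two mask variants could be built, assuming a token layout of [txt_0, txt_1, ..., img] and one flattened boolean image mask per region; the repository's actual layout and background handling may differ:

```python
import torch

def build_region_attn_masks(
    region_masks: list[torch.Tensor],  # one flattened (img_seq_len,) bool mask per region
    txt_seq_lens: list[int],           # number of txt tokens per region
) -> tuple[torch.Tensor, torch.Tensor]:
    img_len = region_masks[0].numel()
    txt_len = sum(txt_seq_lens)
    n = txt_len + img_len
    restricted = torch.zeros(n, n, dtype=torch.bool)

    txt_start = 0
    for region, t_len in zip(region_masks, txt_seq_lens):
        txt_sl = slice(txt_start, txt_start + t_len)
        img_idx = txt_len + region.nonzero(as_tuple=True)[0]
        # txt <-> txt within the same prompt.
        restricted[txt_sl, txt_sl] = True
        # txt <-> img within the region, in both directions.
        restricted[txt_sl, img_idx] = True
        restricted[img_idx, txt_sl] = True
        # Restricted image self-attention: img tokens only attend to their own region.
        restricted[img_idx.unsqueeze(1), img_idx.unsqueeze(0)] = True
        txt_start += t_len

    # Unrestricted variant: same txt rules, but img tokens may attend to all img tokens.
    unrestricted = restricted.clone()
    unrestricted[txt_len:, txt_len:] = True
    return restricted, unrestricted
```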
Ryan Dick
94c088300f
Be smarter about selecting the global CLIP embedding for FLUX regional prompting.
2024-11-25 20:15:04 +00:00
Ryan Dick
3741a6f5e0
Fix device handling for regional masks and apply the attention mask in the FLUX double stream block.
2024-11-25 16:02:03 +00:00
Ryan Dick
2c23b8414c
Use a single global CLIP embedding for FLUX regional guidance.
2024-11-22 23:01:43 +00:00
Ryan Dick
20356c0746
Fixup the logic for preparing FLUX regional prompt attention masks.
2024-11-21 22:46:25 +00:00
Ryan Dick
bad1149504
WIP - add rough logic for preparing the FLUX regional prompting attention mask.
2024-11-20 22:29:36 +00:00
Ryan Dick
fda7aaa7ca
Pass RegionalPromptingExtension down to the CustomDoubleStreamBlockProcessor in FLUX.
2024-11-20 19:48:04 +00:00
Ryan Dick
85c616fa34
WIP - Pass prompt masks to FLUX model during denoising.
2024-11-20 18:51:43 +00:00
Ryan Dick
c6fc82f756
Infer the clip_extra_context_tokens param from the state dict for FLUX XLabs IP-Adapter V2 models.
2024-11-18 17:06:53 -08:00
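A sketch of the shape-based inference described here, with a hypothetical projection key name and hidden size standing in for the real ones:

```python
import torch

def infer_clip_extra_context_tokens(state_dict: dict[str, torch.Tensor]) -> int:
    # Key name and hidden size are assumptions for this sketch.
    proj_weight = state_dict["ip_adapter_proj_model.proj.weight"]
    out_features = proj_weight.shape[0]
    hidden_dim = 4096
    assert out_features % hidden_dim == 0
    # The projection emits (num_tokens * hidden_dim) features, so the token
    # count falls out of the output width instead of being hard-coded.
    return out_features // hidden_dim
```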
David Burnett
9bd17ea02f
Get flux working with MPS on 2.4.1, with GGUF support
2024-10-23 10:20:42 +11:00
Ryan Dick
dde54740c5
Test out IP-Adapter with CFG.
2024-10-21 15:47:17 +00:00
Ryan Dick
fdccdd52d5
Fixes to get XLabsIpAdapterExtension running.
2024-10-21 15:43:00 +00:00
Ryan Dick
31ffd73423
Initial draft of integrating FLUX IP-Adapter inference support.
2024-10-21 15:42:56 +00:00
Ryan Dick
3fa1012879
Add IPAdapterDoubleBlocks wrapper to tidy FLUX ip-adapter handling.
2024-10-21 15:38:50 +00:00
Ryan Dick
c2a8fbd8d6
(minor) Move infer_xlabs_ip_adapter_params_from_state_dict(...) to state_dict_utils.py.
2024-10-21 15:38:50 +00:00
Ryan Dick
d6643d7263
Add model loading code for xlabs FLUX IP-Adapter (not tested).
2024-10-21 15:38:50 +00:00
Ryan Dick
f939dbdc33
Add is_state_dict_xlabs_ip_adapter() utility function.
2024-10-21 15:38:50 +00:00
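A sketch of what a detection utility like this typically looks like: fingerprinting the checkpoint by characteristic key fragments. The fragments below are placeholders, not the actual XLabs key names:

```python
import torch

def is_state_dict_xlabs_ip_adapter(state_dict: dict[str, torch.Tensor]) -> bool:
    # Placeholder fragments; the real check keys off whatever names XLabs uses.
    required_fragments = (
        "ip_adapter_proj_model.",
        "double_blocks.0.processor.",
    )
    return all(any(frag in key for key in state_dict) for frag in required_fragments)
```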
Ryan Dick
24a0ca86f5
Add logic for loading an Xlabs IP-Adapter from a state dict.
2024-10-21 15:38:50 +00:00
Ryan Dick
95c30f6a8b
Add initial logic for inferring FLUX IP-Adapter params from a state_dict.
2024-10-21 15:38:50 +00:00
Ryan Dick
ac7441e606
Fixup typing/imports for IPDoubleStreamBlockProcessor.
2024-10-21 15:38:50 +00:00
Ryan Dick
9c9af312fe
Copy IPDoubleStreamBlockProcessor from 47495425db/src/flux/modules/layers.py (L221).
2024-10-21 15:38:50 +00:00
Ryan Dick
32c7cdd856
Add cfg_scale_start_step and cfg_scale_end_step to FLUX Denoise node.
2024-10-21 14:52:02 +00:00
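A sketch of how a CFG step window like this can be applied during denoising. Only the two parameter names come from the commit; the function and its negative-index handling are illustrative assumptions:

```python
import torch

def apply_cfg(
    step: int,
    total_steps: int,
    cond: torch.Tensor,
    uncond: torch.Tensor | None,
    cfg_scale: float,
    cfg_scale_start_step: int = 0,
    cfg_scale_end_step: int = -1,
) -> torch.Tensor:
    # Negative end indices count back from the last step (assumption).
    end = cfg_scale_end_step if cfg_scale_end_step >= 0 else total_steps + cfg_scale_end_step
    # Outside the window (or with no negative conditioning), skip the CFG blend.
    if uncond is None or not (cfg_scale_start_step <= step <= end):
        return cond
    return uncond + cfg_scale * (cond - uncond)
```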
Ryan Dick
6df4ee5fc8
Make negative_text_conditioning nullable on FLUX Denoise invocation.
2024-10-18 20:31:27 +00:00