psychedelicious
68571ece8f
tidy(app): remove unused methods
2025-07-04 20:35:29 +10:00
psychedelicious
4887424ca3
chore: ruff
2025-07-04 20:35:29 +10:00
Kent Keirsey
28f6a20e71
format import block
2025-07-04 20:35:29 +10:00
Kent Keirsey
c4142e75b2
fix import
2025-07-04 20:35:29 +10:00
Kent Keirsey
fefe563127
fix resizing and versioning
2025-07-04 20:35:29 +10:00
Kent Keirsey
e7ce08cffa
ruff format
2025-07-04 19:24:44 +10:00
Kent Keirsey
983cb5ebd2
ruff ruff
2025-07-04 19:24:44 +10:00
Kent Keirsey
52dbdb7118
ruff
2025-07-04 19:24:44 +10:00
Kent Keirsey
71e6f00e10
test fixes
...
fix
test
fix 2
fix 3
fix 4
yet another
attempt new fix
pray
more pray
lol
2025-07-04 19:24:44 +10:00
Kent Keirsey
51e1c56636
ruff
2025-06-27 18:27:46 +10:00
Kent Keirsey
ca1df60e54
Explain the Magic
2025-06-27 18:27:46 +10:00
Cursor Agent
7549c1250d
Add FLUX Kontext conditioning support for reference images
...
Co-authored-by: kent <kent@invoke.ai>
Fix Kontext sequence length handling in Flux denoise invocation
Co-authored-by: kent <kent@invoke.ai>
Fix Kontext step callback to handle combined token sequences
Co-authored-by: kent <kent@invoke.ai >
fix ruff
Fix Flux Kontext
2025-06-27 18:27:46 +10:00
Ryan Dick
7e894ffe83
Consolidate InpaintExtension implementations for SD3 and FLUX.
2025-04-10 10:50:13 +10:00
psychedelicious
a44bfb4658
fix(mm): handle FLUX models w/ diff in_channels keys
...
Before FLUX Fill was merged, we didn't do any checks for the model variant. We always returned "normal".
To determine if a model is a FLUX Fill model, we need to check the state dict for a specific key. Initially, this logic was too strict and rejected quantized FLUX models. This issue was resolved, but it turns out there is another failure mode - some fine-tunes use a different key.
This change further reduces the strictness, handling the alternate key and also falling back to "normal" if we don't see either key. This effectively restores the previous probing behaviour for all FLUX models.
Closes #7856
Closes #7859
2025-03-31 12:32:55 +11:00
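The fix above loosens the FLUX variant probe: check the state dict for either of two possible `in_channels` keys and fall back to "normal" when neither is present. A minimal sketch of that fallback logic, with hypothetical key names and shapes (the real probe lives in InvokeAI's model manager and uses different identifiers):

```python
# Illustrative only: state_dict values here are (out_features, in_features)
# shape tuples standing in for weight tensors. The assumption that a FLUX
# Fill model has a 384-wide input projection (vs. 64 for "normal") is ours,
# not taken from the commit.
def probe_flux_variant(state_dict):
    # Some fine-tunes store the input projection under an alternate key,
    # so try both; if neither key exists, fall back to "normal" rather
    # than rejecting the model outright.
    for key in ("img_in.weight", "model.diffusion_model.img_in.weight"):
        shape = state_dict.get(key)
        if shape is not None:
            _, in_features = shape
            return "inpaint" if in_features == 384 else "normal"
    return "normal"
```

The fallback is what restores the pre-FLUX-Fill behaviour: any model whose state dict lacks both keys probes as "normal" instead of failing.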
Ryan Dick
5ea3ec5cc8
Get FLUX Fill working. Note: To use FLUX Fill, set guidance to ~30.
2025-03-19 14:45:18 +11:00
Ryan Dick
4c02ba908a
Add support for FLUX Redux masks.
2025-03-06 10:31:17 +11:00
Ryan Dick
f1fde792ee
Get FLUX Redux working: model loading and inference.
2025-03-06 10:31:17 +11:00
Ryan Dick
d5211a8088
Add FluxRedux model type and probing logic.
2025-03-06 10:31:17 +11:00
Ryan Dick
b301785dc8
Normalize the T5 model identifiers so that a FLUX T5 or an SD3 T5 model can be used interchangeably.
2025-01-16 08:33:58 +11:00
Ryan Dick
607d19f4dd
We should not trust the value of `model.device` since the model could be partially-loaded.
2025-01-07 19:22:31 -05:00
Ryan Dick
bcd29c5d74
Remove all cases where we check the 'model.device'. This is no longer trustworthy now that partial loading is permitted.
2025-01-07 00:31:00 +00:00
Ryan Dick
d84adfd39f
Clean up FLUX control LoRA pre-processing logic.
2024-12-17 07:28:45 -05:00
Brandon Rising
70811d0bd0
Remove unexpected artifacts in output images
2024-12-17 07:28:45 -05:00
Brandon Rising
046d19446c
Rename Structural Lora to Control Lora
2024-12-17 07:28:45 -05:00
Brandon Rising
f53da60b84
Lots of updates centered around using the LoRA patcher rather than changing the modules in the transformer model
2024-12-17 07:28:45 -05:00
Brandon Rising
5a035dd19f
Support bnb-quantized NF4 FLUX models, use the ControlNet VAE, and support only one structural LoRA per transformer. Various other refactors and bugfixes
2024-12-17 07:28:45 -05:00
Brandon Rising
f3b253987f
Initial setup for flux tools control loras
2024-12-17 07:28:45 -05:00
Ryan Dick
0bff4ace1b
Revert performance improvement, because it caused flux inference to fail on Mac: https://github.com/invoke-ai/InvokeAI/issues/7422
2024-12-03 15:18:58 +00:00
Ryan Dick
021552fd81
Avoid unnecessary dtype conversions with rope encodings.
2024-11-29 12:32:50 -05:00
Ryan Dick
be73dbba92
Use view() instead of rearrange() for better performance.
2024-11-29 12:32:50 -05:00
Ryan Dick
db9c0cad7c
Replace custom RMSNorm implementation with torch.nn.functional.rms_norm(...) for improved speed.
2024-11-29 12:32:50 -05:00
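The commit above swaps a hand-rolled RMSNorm for `torch.nn.functional.rms_norm(...)` (available in PyTorch 2.4+). The computation both perform, sketched in plain Python for clarity (the real code of course operates on tensors, not lists):

```python
import math

# RMSNorm: divide each element by the root-mean-square of the vector,
# then apply a learned per-channel weight. eps guards against division
# by zero for all-zero inputs.
def rms_norm(x, weight, eps=1e-6):
    rms = math.sqrt(sum(v * v for v in x) / len(x) + eps)
    return [w * v / rms for w, v in zip(weight, x)]

# With unit weights, [1, 2, 2] has mean-square 3, so each element is
# scaled by 1/sqrt(3).
out = rms_norm([1.0, 2.0, 2.0], [1.0, 1.0, 1.0])
```

Delegating to the fused `torch.nn.functional.rms_norm` avoids the extra kernel launches of an elementwise Python/torch implementation, which is where the speedup comes from.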
Ryan Dick
5d8dd6e26e
Fix FLUX regional negative prompts.
2024-11-28 18:49:29 +00:00
Ryan Dick
64364e7911
Short-circuit if there are no region masks in FLUX and don't apply attention masking.
2024-11-27 22:40:10 +00:00
Ryan Dick
6565cea039
Comment out unused _prepare_unrestricted_attn_mask(...) for future reference.
2024-11-27 22:16:44 +00:00
Ryan Dick
3ebd8d6c07
Delete outdated TODO comment.
2024-11-27 22:13:25 +00:00
Ryan Dick
e970185161
Tweak flux regional prompting attention scheme based on latest experimentation.
2024-11-27 22:13:07 +00:00
Ryan Dick
b54463d294
Allow regional prompting background regions to attend to themselves and to the entire txt embedding.
2024-11-26 17:57:31 +00:00
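The regional-prompting commits in this stretch describe one attention scheme: each region's image tokens attend within their region and to their region's txt tokens, while background tokens (covered by no region) attend to themselves and to the entire txt embedding. A rough sketch of building such a joint txt+img boolean mask, with hypothetical names and list-based masks standing in for tensors (the actual extension code is structured differently):

```python
# Illustrative sketch, not InvokeAI's implementation. region_masks is one
# boolean mask per prompt over the n_img image tokens; txt_lens gives the
# txt token count per prompt. Returns an (n_txt+n_img) x (n_txt+n_img)
# allow-matrix: True where query token a may attend to key token b.
def build_regional_attn_mask(region_masks, txt_lens, n_img):
    n_txt = sum(txt_lens)
    n = n_txt + n_img
    allow = [[False] * n for _ in range(n)]

    # Compute each prompt's txt token index range.
    spans, s = [], 0
    for t in txt_lens:
        spans.append((s, s + t))
        s += t

    covered = [False] * n_img
    for (t0, t1), mask in zip(spans, region_masks):
        img_tokens = [n_txt + i for i, m in enumerate(mask) if m]
        for i, m in enumerate(mask):
            covered[i] = covered[i] or m
        # Region tokens (txt + img) attend freely within the region.
        tokens = list(range(t0, t1)) + img_tokens
        for a in tokens:
            for b in tokens:
                allow[a][b] = True

    # Background image tokens attend to themselves and to all txt tokens
    # (and txt tokens may attend back to them).
    for i, c in enumerate(covered):
        if not c:
            q = n_txt + i
            allow[q][q] = True
            for t in range(n_txt):
                allow[q][t] = True
                allow[t][q] = True
    return allow
```

Later commits in this series apply such a mask in both the double and single stream blocks, and short-circuit entirely when no region masks are present.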
Ryan Dick
faee79dc95
Distinguish between restricted and unrestricted attn masks in FLUX regional prompting.
2024-11-26 16:55:52 +00:00
Ryan Dick
e01f66b026
Apply regional attention masks in the single stream blocks in addition to the double stream blocks.
2024-11-25 22:40:08 +00:00
Ryan Dick
53abdde242
Update Flux RegionalPromptingExtension to prepare both a mask with restricted image self-attention and a mask with unrestricted image self-attention.
2024-11-25 22:04:23 +00:00
Ryan Dick
94c088300f
Be smarter about selecting the global CLIP embedding for FLUX regional prompting.
2024-11-25 20:15:04 +00:00
Ryan Dick
3741a6f5e0
Fix device handling for regional masks and apply the attention mask in the FLUX double stream block.
2024-11-25 16:02:03 +00:00
Ryan Dick
2c23b8414c
Use a single global CLIP embedding for FLUX regional guidance.
2024-11-22 23:01:43 +00:00
Ryan Dick
20356c0746
Fixup the logic for preparing FLUX regional prompt attention masks.
2024-11-21 22:46:25 +00:00
Ryan Dick
bad1149504
WIP - add rough logic for preparing the FLUX regional prompting attention mask.
2024-11-20 22:29:36 +00:00
Ryan Dick
fda7aaa7ca
Pass RegionalPromptingExtension down to the CustomDoubleStreamBlockProcessor in FLUX.
2024-11-20 19:48:04 +00:00
Ryan Dick
85c616fa34
WIP - Pass prompt masks to FLUX model during denoising.
2024-11-20 18:51:43 +00:00
Ryan Dick
c6fc82f756
Infer the clip_extra_context_tokens param from the state dict for FLUX XLabs IP-Adapter V2 models.
2024-11-18 17:06:53 -08:00
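Inferring a hyperparameter like `clip_extra_context_tokens` from the state dict usually means reading it off a projection weight's shape. A hedged sketch of the idea with hypothetical key and dimension names (the real XLabs IP-Adapter probe uses different keys):

```python
# Illustrative only: state_dict values are (out_features, in_features)
# shape tuples standing in for weight tensors. The key name and the
# 4096 cross-attention width are assumptions for the example.
def infer_clip_extra_context_tokens(state_dict, cross_attn_dim=4096):
    # The image-projection layer maps a CLIP embedding to
    # n_tokens * cross_attn_dim features, so the token count falls out
    # of the output dimension.
    out_features, _ = state_dict["ip_adapter_proj.weight"]
    return out_features // cross_attn_dim
```

Probing the shape rather than hard-coding the count lets V1 and V2 checkpoints with different token counts load through the same path.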
David Burnett
9bd17ea02f
Get flux working with MPS on 2.4.1, with GGUF support
2024-10-23 10:20:42 +11:00
Ryan Dick
dde54740c5
Test out IP-Adapter with CFG.
2024-10-21 15:47:17 +00:00