Commit Graph

91 Commits

Author SHA1 Message Date
psychedelicious
767ac91f2c fix(nodes): revert unnecessary version bump 2025-07-04 20:35:29 +10:00
Kent Keirsey
fefe563127 fix resizing and versioning 2025-07-04 20:35:29 +10:00
Kent Keirsey
52dbdb7118 ruff 2025-07-04 19:24:44 +10:00
Kent Keirsey
71e6f00e10 test fixes
Squashed fixups: fix; test; fix 2; fix 3; fix 4; yet another; attempt new fix; pray; more pray; lol
2025-07-04 19:24:44 +10:00
Jonathan
83d642ed15 Update flux_denoise.py
Fixed version to 4.0.0
2025-06-30 11:28:02 +10:00
Jonathan
455c73235e Update flux_denoise.py
Updated version, removed WithBoard and WithMetadata
2025-06-30 11:28:02 +10:00
psychedelicious
9066dc1839 tidy(nodes): remove extraneous comments & add useful ones 2025-06-27 18:27:46 +10:00
Kent Keirsey
ca1df60e54 Explain the Magic 2025-06-27 18:27:46 +10:00
Cursor Agent
7549c1250d Add FLUX Kontext conditioning support for reference images
Squashed fixups: Fix Kontext sequence length handling in Flux denoise invocation; Fix Kontext step callback to handle combined token sequences; fix ruff; Fix Flux Kontext
Co-authored-by: kent <kent@invoke.ai>
2025-06-27 18:27:46 +10:00
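The Kontext commits above describe appending reference-image tokens to the latent token sequence and then accounting for the longer combined sequence in the step callback. A minimal sketch of that idea, with hypothetical names (not the actual InvokeAI code):

```python
import torch

def concat_kontext_tokens(latent_tokens: torch.Tensor, ref_tokens: torch.Tensor) -> tuple[torch.Tensor, int]:
    """Hypothetical helper: append reference-image (Kontext) tokens to the
    image token sequence and return the original latent length so callbacks
    can slice the reference tokens back off."""
    combined = torch.cat([latent_tokens, ref_tokens], dim=1)  # (B, L_latent + L_ref, D)
    return combined, latent_tokens.shape[1]

# In a step callback, preview only the denoised latent portion:
# latent_part = combined_output[:, :latent_len, :]
```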
Ryan Dick
7e894ffe83 Consolidate InpaintExtension implementations for SD3 and FLUX. 2025-04-10 10:50:13 +10:00
psychedelicious
9ca071819b chore(nodes): remove beta/prototype flag from a lot of stable nodes 2025-03-27 08:08:44 +11:00
Billy
182580ff69 Imports 2025-03-26 12:55:10 +11:00
Ryan Dick
9cc2232b6f Bump FluxDenoise invocation version and typegen. 2025-03-19 14:45:18 +11:00
Ryan Dick
9fdc06b447 Add FLUX Fill input validation and error/warning reporting. 2025-03-19 14:45:18 +11:00
Ryan Dick
5ea3ec5cc8 Get FLUX Fill working. Note: To use FLUX Fill, set guidance to ~30. 2025-03-19 14:45:18 +11:00
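Per the note in 5ea3ec5cc8, FLUX Fill wants a much higher guidance value (around 30) than a typical FLUX dev run. An illustrative parameterization, with hypothetical keys rather than the actual FluxDenoiseInvocation fields:

```python
# Illustrative only: keys are hypothetical, not the actual node's field names.
flux_fill_params = {
    "guidance": 30.0,  # FLUX Fill responds best around ~30 per the commit note
    "steps": 30,
}
```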
Ryan Dick
f13a07ba6a WIP on updating FluxDenoise to support FLUX Fill. 2025-03-19 14:45:18 +11:00
Ryan Dick
a913f0163d WIP - Add FluxFillInvocation 2025-03-19 14:45:18 +11:00
Jonathan
518a7c941f Changed version of FluxDenoiseInvocation
A Redux field was added but the node version wasn't updated.
2025-03-07 07:33:31 +11:00
Ryan Dick
4c02ba908a Add support for FLUX Redux masks. 2025-03-06 10:31:17 +11:00
Ryan Dick
f1fde792ee Get FLUX Redux working: model loading and inference. 2025-03-06 10:31:17 +11:00
Ryan Dick
607d19f4dd We should not trust the value of since the model could be partially-loaded. 2025-01-07 19:22:31 -05:00
Ryan Dick
71b97ce7be Reduce the likelihood of encountering https://github.com/invoke-ai/InvokeAI/issues/7513 by eliminating places where the door was left open for this to happen. 2025-01-07 01:20:15 +00:00
Ryan Dick
6d7314ac0a Consolidate the LayerPatching patching modes into a single implementation. 2024-12-24 15:57:54 +00:00
Ryan Dick
80db9537ff Rename model_patcher.py -> layer_patcher.py. 2024-12-24 15:57:54 +00:00
Ryan Dick
61253b91f1 Enable LoRAPatcher.apply_smart_lora_patches(...) throughout the stack. 2024-12-24 15:57:54 +00:00
Mary Hipp
c154d833b9 Raise an error if a control LoRA is used with FLUX schnell 2024-12-18 10:19:28 -05:00
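A sketch of the kind of guard c154d833b9 describes, assuming illustrative names rather than the actual invocation code:

```python
def validate_control_lora_compat(base_variant: str, control_lora_count: int) -> None:
    """Illustrative guard (names are hypothetical, not the actual node code)."""
    if control_lora_count > 0 and base_variant == "schnell":
        raise ValueError(
            "FLUX control LoRAs are not supported with FLUX schnell; use a FLUX dev base model."
        )
```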
Ryan Dick
dd09509dbd Rename ModelPatcher -> LayerPatcher to avoid conflicts with another ModelPatcher definition. 2024-12-17 13:20:19 +00:00
Ryan Dick
7fad4c9491 Rename LoRAModelRaw to ModelPatchRaw. 2024-12-17 13:20:19 +00:00
Ryan Dick
b820862eab Rename ModelPatcher methods to reflect that they are general model patching methods and are not LoRA-specific. 2024-12-17 13:20:19 +00:00
Ryan Dick
c604a0956e Rename LoRAPatcher -> ModelPatcher. 2024-12-17 13:20:19 +00:00
Ryan Dick
41664f88db Rename backend/patches/conversions/ to backend/patches/lora_conversions/ 2024-12-17 13:20:19 +00:00
Ryan Dick
42f8d6aa11 Rename backend/lora/ to backend/patches 2024-12-17 13:20:19 +00:00
Ryan Dick
a4bed7aee3 Minor tidy of FLUX control LoRA implementation. (mostly documentation) 2024-12-17 07:28:45 -05:00
Ryan Dick
d84adfd39f Clean up FLUX control LoRA pre-processing logic. 2024-12-17 07:28:45 -05:00
Brandon Rising
046d19446c Rename Structural Lora to Control Lora 2024-12-17 07:28:45 -05:00
Brandon Rising
f53da60b84 Lots of updates centered around using the lora patcher rather than changing the modules in the transformer model 2024-12-17 07:28:45 -05:00
Brandon Rising
5a035dd19f Support bnb-quantized NF4 FLUX models, use ControlNet VAE, only support 1 structural LoRA per transformer; various other refactors and bugfixes 2024-12-17 07:28:45 -05:00
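5a035dd19f mentions bnb-quantized NF4 FLUX models. For reference only, and assuming the general bitsandbytes/transformers API rather than InvokeAI's own quantized-model loading path, an NF4 configuration looks roughly like this:

```python
import torch
from transformers import BitsAndBytesConfig

# Reference-only NF4 setup via bitsandbytes/transformers; InvokeAI uses its
# own quantized-model loading code, so this is purely illustrative.
nf4_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
```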
Brandon Rising
f3b253987f Initial setup for flux tools control loras 2024-12-17 07:28:45 -05:00
Jonathan
bb0ed5dc8a Update flux_denoise.py
Updated node version for FLUX Denoise.
2024-11-30 08:29:21 -05:00
Ryan Dick
53abdde242 Update Flux RegionalPromptingExtension to prepare both a mask with restricted image self-attention and a mask with unrestricted image self-attention. 2024-11-25 22:04:23 +00:00
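53abdde242 prepares two image self-attention masks: one that restricts image tokens to attending within their own region(s) and one that leaves self-attention unrestricted. A simplified sketch of how such masks could be built, not the actual implementation:

```python
import torch

def build_image_self_attn_masks(region_masks: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
    """Simplified illustration, not the InvokeAI implementation.

    region_masks: (num_regions, num_image_tokens) bool membership per region.
    Returns (restricted, unrestricted) bool masks of shape
    (num_image_tokens, num_image_tokens); True means attention is allowed.
    """
    n_tokens = region_masks.shape[1]
    # Restricted: an image token may only attend to tokens sharing at least one region.
    restricted = torch.zeros(n_tokens, n_tokens, dtype=torch.bool)
    for region in region_masks:
        restricted |= region.unsqueeze(0) & region.unsqueeze(1)
    # Unrestricted: plain full self-attention among image tokens.
    unrestricted = torch.ones(n_tokens, n_tokens, dtype=torch.bool)
    return restricted, unrestricted
```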
Ryan Dick
3741a6f5e0 Fix device handling for regional masks and apply the attention mask in the FLUX double stream block. 2024-11-25 16:02:03 +00:00
Ryan Dick
20356c0746 Fixup the logic for preparing FLUX regional prompt attention masks. 2024-11-21 22:46:25 +00:00
Ryan Dick
fda7aaa7ca Pass RegionalPromptingExtension down to the CustomDoubleStreamBlockProcessor in FLUX. 2024-11-20 19:48:04 +00:00
Ryan Dick
85c616fa34 WIP - Pass prompt masks to FLUX model during denoising. 2024-11-20 18:51:43 +00:00
Jonathan
2f6b035138 Update flux_denoise.py
Added a bool that lets the node either add noise to the initial latents (the default) or leave them untouched.
2024-11-07 08:44:10 -05:00
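2f6b035138 adds a flag that either noises the initial latents at the starting timestep (the default) or leaves them alone. A hedged sketch of that behavior, assuming a rectified-flow style blend and hypothetical names:

```python
import torch

def prepare_init_latents(
    init_latents: torch.Tensor, noise: torch.Tensor, t_start: float, add_noise: bool = True
) -> torch.Tensor:
    """Illustrative sketch of the option above (not the actual node code)."""
    if not add_noise:
        # Leave the incoming latents untouched.
        return init_latents
    # Blend toward pure noise as t_start -> 1.0 (rectified-flow style noising).
    return t_start * noise + (1.0 - t_start) * init_latents
```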
psychedelicious
61496fdcbc fix(nodes): load IP Adapter images as RGB
FLUX IP Adapter only works with RGB. Did the same for non-FLUX to be safe & consistent, though I don't think it's strictly necessary.
2024-10-23 08:34:15 +10:00
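61496fdcbc forces IP Adapter input images to RGB. The conversion itself is a standard Pillow call; the helper name below is illustrative:

```python
from PIL import Image

def load_ip_adapter_image(path: str) -> Image.Image:
    # IP Adapter image encoders expect 3-channel input, so normalize
    # palette/RGBA/grayscale images to RGB up front.
    return Image.open(path).convert("RGB")
```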
Ryan Dick
5cbe89afdd Merge branch 'main' into ryan/flux-ip-adapter-cfg-2 2024-10-22 21:17:36 +00:00
Ryan Dick
d20b894a61 Add cfg_scale_start_step and cfg_scale_end_step to FLUX Denoise node. 2024-10-23 07:59:48 +11:00
Ryan Dick
20362448b9 Make negative_text_conditioning nullable on FLUX Denoise invocation. 2024-10-23 07:59:48 +11:00
Ryan Dick
5df10cc494 Add support for cfg_scale list on FLUX Denoise node. 2024-10-23 07:59:48 +11:00
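The last three commits add CFG controls to the FLUX Denoise node: a nullable negative conditioning, a per-step cfg_scale list, and start/end steps that bound where CFG applies. A combined sketch of how those options might be resolved, with hypothetical helper names:

```python
import torch

def resolve_cfg_schedule(
    cfg_scale: float | list[float], num_steps: int, start_step: int, end_step: int
) -> list[float]:
    """Illustrative: expand a scalar or per-step list into a full schedule and
    disable CFG outside [start_step, end_step]. Names are hypothetical."""
    if isinstance(cfg_scale, (int, float)):
        scales = [float(cfg_scale)] * num_steps
    else:
        scales = list(cfg_scale)
    assert len(scales) == num_steps, "cfg_scale list must have one entry per step"
    return [s if start_step <= i <= end_step else 1.0 for i, s in enumerate(scales)]

def apply_cfg(cond: torch.Tensor, uncond: torch.Tensor | None, scale: float) -> torch.Tensor:
    # With no negative conditioning (nullable) or a scale of 1.0, CFG is a no-op.
    if uncond is None or scale == 1.0:
        return cond
    return uncond + scale * (cond - uncond)
```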