Commit Graph

21 Commits

Author | SHA1 | Message | Date
Ryan Dick | 6c919e1bca | Handle DoRA layer device casting when model is partially-loaded. | 2025-01-28 14:51:35 +00:00
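
A partially-loaded model keeps some weights on the GPU and the rest on the CPU, so a patch cannot assume a single device. A minimal sketch of the casting this commit implies (the helper name is hypothetical, not the repository's code):

```python
import torch

def cast_patch_to_base(patch_tensors: dict[str, torch.Tensor],
                       base_weight: torch.Tensor) -> dict[str, torch.Tensor]:
    # Move every patch tensor onto the device/dtype of the base weight being
    # patched, so the patch applies cleanly whether that weight currently
    # lives on the CPU or the GPU.
    return {name: t.to(device=base_weight.device, dtype=base_weight.dtype)
            for name, t in patch_tensors.items()}
```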
Ryan Dick | 5357d6e08e | Rename ConcatenatedLoRALayer to MergedLayerPatch. And other minor cleanup. | 2025-01-28 14:51:35 +00:00
Ryan Dick | 7fef569e38 | Update frontend graph building logic to support FLUX LoRAs that modify the T5 encoder weights. | 2025-01-28 14:51:35 +00:00
Ryan Dick | e7fb435cc5 | Update DoRALayer with a custom get_parameters() override that 1) applies alpha scaling to delta_v, and 2) warns if the base model is incompatible. | 2025-01-28 14:51:35 +00:00
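
A sketch of what such an override might look like (attribute names like `up`, `down`, and `magnitude` follow the standard DoRA formulation and are assumptions, not the repository's actual fields):

```python
import logging
import torch

logger = logging.getLogger(__name__)

class DoRALayerSketch:
    def __init__(self, up: torch.Tensor, down: torch.Tensor,
                 alpha: float, magnitude: torch.Tensor):
        # up: (out, rank), down: (rank, in), magnitude: (out, 1) learned norms.
        self.up, self.down, self.alpha, self.magnitude = up, down, alpha, magnitude

    def get_parameters(self, orig_parameters: dict[str, torch.Tensor],
                       weight: float) -> dict[str, torch.Tensor]:
        orig_weight = orig_parameters["weight"]

        # 1) Apply alpha/rank scaling to the low-rank update (delta_v).
        rank = self.down.shape[0]
        delta_v = (self.alpha / rank) * (self.up @ self.down)

        # 2) Warn, rather than crash, when the patch clearly does not fit the base model.
        if delta_v.shape != orig_weight.shape:
            logger.warning(
                "DoRA patch shape %s does not match base weight shape %s; the base "
                "model is likely incompatible with this patch.",
                tuple(delta_v.shape), tuple(orig_weight.shape),
            )

        # DoRA: renormalize (W + delta_v) per output channel to the learned magnitude.
        merged = orig_weight + delta_v
        new_weight = self.magnitude * merged / merged.norm(dim=1, keepdim=True)
        # Assuming the patch returns an additive diff scaled by the patch weight.
        return {"weight": weight * (new_weight - orig_weight)}
```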
Ryan Dick | 5d472ac1b8 | Move quantized weight handling for patch layers up from ConcatenatedLoRALayer to CustomModuleMixin. | 2025-01-28 14:51:35 +00:00
Ryan Dick | 28514ba59a | Update ConcatenatedLoRALayer to work with all sub-layer types. | 2025-01-28 14:51:35 +00:00
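
Concatenated/merged layers exist because some checkpoints fuse several logical projections (e.g. Q, K, V) into a single weight; each sub-layer patches its own slice. A rough sketch, with `get_delta` standing in for whatever per-sub-layer API the repository actually uses:

```python
import torch

def merged_layer_delta(sub_layers, sub_out_features: list[int],
                       orig_weight: torch.Tensor) -> torch.Tensor:
    # Each sub-layer computes a delta for its own slice of the fused weight
    # (e.g. the Q, K, and V blocks of a combined QKV projection); the slices
    # are then concatenated back together along the output dimension (dim 0).
    deltas = []
    offset = 0
    for sub, n_out in zip(sub_layers, sub_out_features):
        orig_slice = orig_weight[offset : offset + n_out]
        deltas.append(sub.get_delta(orig_slice))  # hypothetical sub-layer method
        offset += n_out
    return torch.cat(deltas, dim=0)
```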
Ryan Dick | 409b69ee5d | Fix typo in DoRALayer. | 2025-01-28 14:51:35 +00:00
Ryan Dick | 206f261e45 | Add utils for loading FLUX OneTrainer DoRA models. | 2025-01-28 14:51:35 +00:00
Ryan Dick | 4f369e3dfb | First draft of DoRALayer. Not tested yet. | 2025-01-28 14:51:35 +00:00
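
For reference, the standard DoRA formulation (Liu et al., 2024) that a DoRALayer would presumably implement decomposes the patched weight into a learned magnitude and a normalized direction:

```latex
W' = m \cdot \frac{W_0 + \Delta W}{\lVert W_0 + \Delta W \rVert_c},
\qquad \Delta W = \frac{\alpha}{r}\, B A
```

where m is the learned per-channel magnitude and ||·||_c is the column-wise norm; the alpha/rank factor is the same scaling that the get_parameters() commit above applies to delta_v.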
Ryan Dick | f88c1ba0c3 | Fix bug with some LoRA variants when applied to a BnB NF4 quantized model. Note the previous commit which added a unit test to trigger this bug. | 2025-01-22 09:20:40 +11:00
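
The underlying hazard with quantized base weights is general: a bitsandbytes NF4 weight is a packed uint8 tensor whose shape and dtype do not match the logical weight, so naively computing `orig_weight + delta` misbehaves. A hedged sketch of the usual fix, with `dequantize_nf4` as a stand-in for the real bitsandbytes call:

```python
import torch

def dequantize_nf4(packed_weight: torch.Tensor) -> torch.Tensor:
    """Stand-in for the real bitsandbytes NF4 dequantization; hypothetical."""
    raise NotImplementedError

def patch_quantized_weight(orig_weight: torch.Tensor, is_nf4: bool,
                           delta: torch.Tensor, weight: float) -> torch.Tensor:
    # A packed NF4 tensor cannot be added to directly: dequantize to a float
    # tensor with the logical shape first, then apply the (scaled) patch delta.
    w = dequantize_nf4(orig_weight) if is_nf4 else orig_weight
    return w + weight * delta.to(device=w.device, dtype=w.dtype)
```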
Ryan Dick | 2619ef53ca | Handle device casting in ia3_layer.py. | 2025-01-07 00:31:00 +00:00
Ryan Dick | 2855bb6b41 | Update BaseLayerPatch.get_parameters(...) to accept a dict of orig_parameters rather than orig_module. This will enable compatibility between patching and cpu->gpu streaming. | 2024-12-28 21:12:53 +00:00
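
A sketch of a patch written against the tensor-dict signature this commit describes (the class and delta layout here are illustrative, not the repository's code):

```python
import torch

class ExamplePatch:
    def __init__(self, delta: torch.Tensor):
        self.delta = delta

    def get_parameters(self, orig_parameters: dict[str, torch.Tensor],
                       weight: float) -> dict[str, torch.Tensor]:
        # The patch sees plain tensors rather than a live nn.Module, so the
        # caller can pass whichever copy of the weights is currently resident
        # -- exactly what makes patching compatible with cpu->gpu streaming.
        base = orig_parameters["weight"]
        return {"weight": weight * self.delta.to(device=base.device, dtype=base.dtype)}
```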
Ryan Dick | b272d46056 | Enable ability to control the weight of FLUX Control LoRAs. | 2024-12-17 13:36:10 +00:00
Ryan Dick | 37e3089457 | Push LoRA layer reshaping down into the patch layers and add a new FluxControlLoRALayer type. | 2024-12-17 13:20:19 +00:00
Ryan Dick | fe09f2d27a | Move handling of LoRA scale and patch weight down into the layer patch classes. | 2024-12-17 13:20:19 +00:00
Ryan Dick | 606d58d7db | Add sidecar wrapper for FLUX RMSNorm layers to support SetParameterLayers used by FLUX structural control LoRAs. | 2024-12-17 13:20:19 +00:00
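
Structural control LoRAs replace the RMSNorm scale outright (a SetParameterLayer) rather than adding a low-rank delta, so the wrapper's job is to run the normalization with a substituted parameter. A minimal sketch, assuming the usual RMSNorm formulation (not the repository's actual wrapper):

```python
import torch

class RMSNormSidecar(torch.nn.Module):
    def __init__(self, new_scale: torch.Tensor, eps: float = 1e-6):
        super().__init__()
        self.new_scale = new_scale  # full replacement supplied by the patch
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Standard RMSNorm, but with the patch's scale in place of the original.
        inv_rms = x.pow(2).mean(dim=-1, keepdim=True).add(self.eps).rsqrt()
        return x * inv_rms * self.new_scale.to(device=x.device, dtype=x.dtype)
```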
Ryan Dick | 808e3770d3 | Remove AnyLoRALayer type definition in favor of using BaseLayerPatch base class. | 2024-12-17 13:20:19 +00:00
Ryan Dick | 2b441d6a2d | Add BaseLayerPatch ABC to clarify the intended patch interface. | 2024-12-17 13:20:19 +00:00
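
A rough sketch of what such an ABC could look like at the time of this commit, i.e. before the later change that switched get_parameters() to a dict of tensors (method names inferred from the commit messages above, not copied from the repository):

```python
from abc import ABC, abstractmethod

import torch

class BaseLayerPatchSketch(ABC):
    @abstractmethod
    def get_parameters(self, orig_module: torch.nn.Module,
                       weight: float) -> dict[str, torch.Tensor]:
        """Return the parameter changes this patch applies to a module,
        scaled by the user-facing patch weight."""

    @abstractmethod
    def to(self, device: torch.device | None = None,
           dtype: torch.dtype | None = None) -> None:
        """Move the patch's own tensors (supports device casting)."""
```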
Ryan Dick | 8ea697d733 | Mark LoRALayerBase.rank(...) as a private method. | 2024-12-17 13:20:19 +00:00
Ryan Dick | 693d42661c | Add basic unit tests for LoRALayer. | 2024-12-17 13:20:19 +00:00
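
In the same spirit, a basic unit test for a plain LoRA layer might check the delta's shape and its alpha/rank scaling (illustrative, not the repository's actual tests):

```python
import torch

def test_lora_delta_shape_and_scaling():
    out_features, in_features, rank, alpha = 8, 4, 2, 2.0
    up = torch.randn(out_features, rank)
    down = torch.randn(rank, in_features)

    # The classic LoRA delta: (alpha / rank) * up @ down.
    delta = (alpha / rank) * (up @ down)
    assert delta.shape == (out_features, in_features)

    # Halving the patch weight should halve the applied delta.
    assert torch.allclose(0.5 * delta, (alpha / rank) * (0.5 * up) @ down)
```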
Ryan Dick | 42f8d6aa11 | Rename backend/lora/ to backend/patches | 2024-12-17 13:20:19 +00:00