Ryan Dick | f88c1ba0c3 | 2025-01-22 09:20:40 +11:00
Fix bug with some LoRA variants when applied to a BnB NF4 quantized model. Note the previous commit, which added a unit test to trigger this bug.

Ryan Dick | 2855bb6b41 | 2024-12-28 21:12:53 +00:00
Update BaseLayerPatch.get_parameters(...) to accept a dict of orig_parameters rather than orig_module. This will enable compatibility between patching and cpu->gpu streaming.

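The design described in this commit can be sketched in miniature (all names here are hypothetical stand-ins, not InvokeAI's actual API): because the patch only ever sees a plain dict of parameter values, the caller is free to stream those values between devices before handing them in, which is what makes patching compatible with cpu->gpu streaming.

```python
class ScalePatch:
    """Toy patch whose get_parameters() sees only a dict of parameter
    values, never the original module object. A caller can therefore
    move (stream) the values to any device before calling it."""

    def __init__(self, factor):
        self.factor = factor

    def get_parameters(self, orig_parameters, weight):
        # Additive delta that scales each parameter by `factor` when
        # applied at full weight: p + weight * (factor - 1) * p.
        return {name: weight * (self.factor - 1.0) * value
                for name, value in orig_parameters.items()}

orig = {"weight": 2.0, "bias": 4.0}
deltas = ScalePatch(1.5).get_parameters(orig, weight=1.0)
patched = {k: orig[k] + deltas[k] for k in orig}
```

Since the patch never touches the module, the same patch object can be reused against parameter dicts living on any device.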
Ryan Dick | 37e3089457 | 2024-12-17 13:20:19 +00:00
Push LoRA layer reshaping down into the patch layers and add a new FluxControlLoRALayer type.

Ryan Dick | fe09f2d27a | 2024-12-17 13:20:19 +00:00
Move handling of LoRA scale and patch weight down into the layer patch classes.

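The idea of a layer owning its own scaling can be illustrated with a toy LoRA layer (a sketch under assumptions; the class and method names are invented, and real implementations use tensors rather than lists): the layer multiplies its low-rank product by the conventional alpha/rank factor and by the caller's patch weight itself, so the patcher just adds the returned delta.

```python
def matmul(a, b):
    """Minimal list-of-lists matrix multiply."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

class ToyLoRALayer:
    """Toy stand-in for a LoRA layer that handles its own scaling.

    The delta it returns is already multiplied by alpha/rank and by the
    caller-supplied patch weight, so callers need no scaling logic."""

    def __init__(self, up, down, alpha):
        self.up = up          # (out_features x rank)
        self.down = down      # (rank x in_features)
        self.alpha = alpha
        self.rank = len(down)

    def get_weight_delta(self, patch_weight):
        scale = self.alpha / self.rank * patch_weight
        raw = matmul(self.up, self.down)
        return [[scale * v for v in row] for row in raw]

layer = ToyLoRALayer(up=[[1.0], [2.0]], down=[[3.0, 4.0]], alpha=1.0)
delta = layer.get_weight_delta(patch_weight=0.5)
```

Keeping the scale inside the layer means variant layer types (with different scaling conventions) can coexist behind one interface.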
Ryan Dick | 2b441d6a2d | 2024-12-17 13:20:19 +00:00
Add BaseLayerPatch ABC to clarify the intended patch interface.

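An ABC of this kind might look like the following minimal sketch (method signature and names are assumptions, not the project's actual code): the abstract base class documents the one required method, and Python refuses to instantiate subclasses that fail to implement it.

```python
from abc import ABC, abstractmethod

class BaseLayerPatch(ABC):
    """Hypothetical abstract patch interface."""

    @abstractmethod
    def get_parameters(self, orig_parameters: dict, weight: float) -> dict:
        """Return additive parameter deltas keyed like orig_parameters."""

class IdentityPatch(BaseLayerPatch):
    """Conforming subclass: applies a zero delta to every parameter."""
    def get_parameters(self, orig_parameters, weight):
        return {name: 0.0 for name in orig_parameters}

class BrokenPatch(BaseLayerPatch):
    """Non-conforming subclass: forgets get_parameters()."""

deltas = IdentityPatch().get_parameters({"weight": 1.0}, weight=1.0)
try:
    BrokenPatch()          # abstract method missing -> TypeError
    instantiable = True
except TypeError:
    instantiable = False
```

The ABC makes the intended contract explicit, so new patch types fail loudly at construction time rather than at apply time.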
Ryan Dick | 8ea697d733 | 2024-12-17 13:20:19 +00:00
Mark LoRALayerBase.rank(...) as a private method.

Ryan Dick | 693d42661c | 2024-12-17 13:20:19 +00:00
Add basic unit tests for LoRALayer.

Ryan Dick | 42f8d6aa11 | 2024-12-17 13:20:19 +00:00
Rename backend/lora/ to backend/patches.