Jordan
c669336d6b
Update lora_manager.py
2023-02-21 02:05:11 -07:00
Jordan
5529309e73
adjusting back to hooks, forcing them to be last in execution
2023-02-21 01:34:06 -07:00
Jordan
49c0516602
change hook to override
2023-02-20 23:45:57 -07:00
Jordan
c1c62f770f
Merge branch 'main' into add_lora_support
2023-02-20 20:33:59 -08:00
Jordan
e2b6dfeeb9
Update generate.py
2023-02-20 21:33:20 -07:00
neecapp
3732af63e8
fix prompt
2023-02-20 23:06:05 -05:00
Lincoln Stein
4c2a588e1f
Merge branch 'main' into perf/lowmem_sequential_guidance
2023-02-20 22:40:31 -05:00
Jordan
de89041779
optimize functions for unloading
2023-02-20 17:02:36 -07:00
Jordan
488326dd95
Merge branch 'add_lora_support' of https://github.com/jordanramstad/InvokeAI into add_lora_support
2023-02-20 16:50:16 -07:00
Jordan
c3edede73f
add notes and adjust functions
2023-02-20 16:49:59 -07:00
Jordan
6e730bd654
Merge branch 'main' into add_lora_support
2023-02-20 15:34:52 -08:00
Jordan
884a5543c7
adjust loader to use a settings dict
2023-02-20 16:33:53 -07:00
Jordan
ac972ebbe3
update prompt setup so LoRAs can be loaded in other ways
2023-02-20 16:06:30 -07:00
Jordan
3c6c18b34c
cleanup suggestions from neecap
2023-02-20 15:19:29 -07:00
Lincoln Stein
833079140b
Merge branch 'main' into enhance/update-menu
2023-02-20 17:16:20 -05:00
Lincoln Stein
fd27948c36
Merge branch 'main' into perf/lowmem_sequential_guidance
2023-02-20 17:15:33 -05:00
Jordan
8f6e43d4a4
code cleanup
2023-02-20 14:06:58 -07:00
blessedcoolant
a30c91f398
Merge branch 'main' into bugfix/textual-inversion-training
2023-02-21 09:58:19 +13:00
Lincoln Stein
3fa1771cc9
Merge branch 'main' into perf/lowmem_sequential_guidance
2023-02-20 15:20:15 -05:00
Lincoln Stein
1d9845557f
reduced verbosity of embed loading messages
2023-02-20 15:18:55 -05:00
Lincoln Stein
47ddc00c6a
in textual inversion training, skip over non-image files
- Closes #2715
2023-02-20 14:44:10 -05:00
Lincoln Stein
0d22fd59ed
restore ability of textual inversion manager to read .pt files
- Fixes longstanding bug in the token vector size code which caused
.pt files to be assigned the wrong token vector length. These
were then tossed out during directory scanning.
2023-02-20 14:34:14 -05:00
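For context, the compatibility test in question compares the embedding's token vector width against the model's token-embedding width. A minimal sketch of that check, assuming the common A1111-style `.pt` layout with a `string_to_param` dict; the helper name and exact key layout are assumptions, not the manager's actual code:

```python
import torch

def embedding_matches_model(pt_path: str, model_token_dim: int) -> bool:
    """Return True if a .pt textual-inversion embedding has the same token
    vector width as the current model (e.g. 768 for SD 1.x, 1024 for SD 2.x).

    Assumes the A1111-style layout where tensors live under 'string_to_param';
    real files vary, which is what made the original test fragile.
    """
    data = torch.load(pt_path, map_location="cpu")
    params = data["string_to_param"]         # placeholder string -> tensor
    tensor = next(iter(params.values()))     # shape: [num_vectors, token_dim]
    return tensor.shape[-1] == model_token_dim
```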
neecapp
e744774171
Rewrite lora manager with hooks
2023-02-20 13:49:16 -05:00
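The hook-based approach referenced here (and revisited in the later "change hook to override" and "adjusting back to hooks" entries) can be illustrated with a minimal sketch: a PyTorch forward hook adds a low-rank delta to a Linear layer's output. Class and attribute names are illustrative only, not the actual lora manager API:

```python
import torch.nn as nn

class LoRAHook:
    """Sketch: apply a low-rank update to an nn.Linear via a forward hook."""

    def __init__(self, module: nn.Linear, rank: int = 4, scale: float = 1.0):
        dev, dt = module.weight.device, module.weight.dtype
        self.down = nn.Linear(module.in_features, rank, bias=False).to(dev, dt)   # LoRA "A"
        self.up = nn.Linear(rank, module.out_features, bias=False).to(dev, dt)    # LoRA "B"
        nn.init.zeros_(self.up.weight)   # start as a no-op until weights are loaded
        self.scale = scale
        # Forward hooks fire in registration order, so a hook registered last
        # sees (and can modify) the output after any earlier hooks have run.
        self.handle = module.register_forward_hook(self._hook)

    def _hook(self, module, inputs, output):
        # Returning a value from a forward hook replaces the module's output.
        return output + self.scale * self.up(self.down(inputs[0]))

    def remove(self):
        self.handle.remove()
```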
Lincoln Stein
cf53bba99e
Merge branch 'main' into bugfix/save-intermediates
2023-02-20 12:51:53 -05:00
Lincoln Stein
ed4c8f6a8a
fix crash in CLI when --save_intermediates called
Fixes #2733
2023-02-20 12:50:32 -05:00
Jonathan
b21bd6f428
Fix crash on calling diffusers' prepare_attention_mask
Diffusers' `prepare_attention_mask` was crashing when we didn't pass in a batch size.
2023-02-20 11:12:47 -06:00
Kevin Turner
cb6903dfd0
Merge branch 'main' into perf/lowmem_sequential_guidance
2023-02-20 08:03:11 -08:00
blessedcoolant
58e5bf5a58
Merge branch 'main' into bugfix/embedding-compatibility-test
2023-02-21 04:09:18 +13:00
blessedcoolant
cc7733af1c
Merge branch 'main' into enhance/update-menu
2023-02-21 03:54:40 +13:00
Lincoln Stein
cfd897874b
Merge branch 'main' into perf/lowmem_sequential_guidance
2023-02-20 07:42:35 -05:00
Lincoln Stein
1249147c57
Merge branch 'main' into enhance/update-menu
2023-02-20 07:38:56 -05:00
Lincoln Stein
eec5c3bbb1
Merge branch 'main' into main
2023-02-20 07:38:08 -05:00
Jonathan
ca8d9fb885
Add symmetry to generation ( #2675 )
Added symmetry to Invoke based on discussions with @damian0815. This can currently only be activated via the CLI with the `--h_symmetry_time_pct` and `--v_symmetry_time_pct` options. Those take values from 0.0-1.0, exclusive, indicating the percentage through generation at which symmetry is applied as a one-time operation. To have symmetry in either axis applied after the first step, use a very low value like 0.001.
2023-02-20 07:33:19 -05:00
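A rough sketch of the "one-time operation at a time percentage" behavior described above, e.g. for `--h_symmetry_time_pct 0.3`; the function and its placement in the denoising loop are hypothetical, not the actual generation code:

```python
import torch

def maybe_apply_h_symmetry(latents: torch.Tensor, step: int, total_steps: int,
                           h_symmetry_time_pct: float, already_applied: bool):
    """Mirror the left half of the latents onto the right half, exactly once,
    when the configured fraction of the denoising steps has elapsed."""
    if already_applied or (step / total_steps) < h_symmetry_time_pct:
        return latents, already_applied
    half = latents.shape[-1] // 2
    latents[..., -half:] = torch.flip(latents[..., :half], dims=[-1])
    return latents, True
```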
Jordan
096e1d3a5d
start of rewrite for add / remove
2023-02-20 02:37:44 -07:00
Kevin Turner
2dded68267
add --sequential_guidance option for low-RAM tradeoff
2023-02-19 21:21:14 -08:00
Lincoln Stein
172ce3dc25
correctly detect when an embedding is incompatible with the current model
- Fixed the test for token length; tested on several .pt and .bin files
- Also added a __main__ entrypoint for CLI.py, to make pdb debugging a bit
more convenient.
2023-02-19 22:30:57 -05:00
Kevin Turner
6c8d4b091e
dev(InvokeAIDiffuserComponent): mollify type checker's concern about the optional argument
2023-02-19 16:58:54 -08:00
Jordan
82e4d5aed2
change to new method to load safetensors
2023-02-19 17:33:24 -07:00
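The "new method" here presumably refers to the dedicated safetensors loader rather than `torch.load`; a minimal example of that API, with the file name as a placeholder:

```python
from safetensors.torch import load_file

# safetensors stores a flat name -> tensor dict and loads it without
# executing pickled code, unlike torch.load on a .pt/.ckpt file.
state_dict = load_file("lora_weights.safetensors", device="cpu")
```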
Kevin Turner
d0abe13b60
performance(InvokeAIDiffuserComponent): add low-memory path for calculating conditioned and unconditioned predictions sequentially
Proof of concept. Still needs to be wired up to options or heuristics.
2023-02-19 16:04:54 -08:00
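The technique named here, running the unconditioned and conditioned UNet predictions one after the other instead of in a single doubled batch, roughly looks like the following; function and argument names are placeholders, not the InvokeAIDiffuserComponent interface:

```python
import torch

def cfg_noise_prediction(unet, latents, t, uncond_emb, cond_emb,
                         guidance_scale: float, sequential: bool):
    """Classifier-free guidance, either batched (faster, higher peak memory)
    or sequential (slower, roughly half the activation memory)."""
    if sequential:
        noise_uncond = unet(latents, t, encoder_hidden_states=uncond_emb).sample
        noise_cond = unet(latents, t, encoder_hidden_states=cond_emb).sample
    else:
        both = unet(torch.cat([latents, latents]), t,
                    encoder_hidden_states=torch.cat([uncond_emb, cond_emb])).sample
        noise_uncond, noise_cond = both.chunk(2)
    return noise_uncond + guidance_scale * (noise_cond - noise_uncond)
```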
Kevin Turner
aca9d74489
refactor(InvokeAIDiffuserComponent): rename internal methods
Prefix with `_` as is tradition.
2023-02-19 15:33:16 -08:00
Jonathan
d3c1b747ee
Fix behavior when encountering a bad embedding ( #2721 )
When encountering a bad embedding, InvokeAI was asking about reconfiguring models. This is because the embedding load error was never handled - it now is.
2023-02-19 14:04:59 +00:00
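In other words, the load error is now caught and the file skipped instead of bubbling up into the model-reconfiguration prompt. A hedged sketch of that pattern, not the actual handler:

```python
import torch

def load_embedding_file(path: str):
    try:
        return torch.load(path, map_location="cpu")
    except Exception as err:
        # Skip the bad file and keep going rather than letting the error
        # propagate and trigger the "reconfigure models?" flow.
        print(f'** Notice: failed to load embedding "{path}": {err} -- skipping')
        return None
```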
Jordan
afc8639c25
add pending support for safetensors with cloneofsimo/lora
2023-02-18 21:07:34 -07:00
Kevin Turner
671c5943e4
Merge remote-tracking branch 'origin/main' into api/add-trigger-string-retrieval
# Conflicts:
# ldm/generate.py
2023-02-18 17:44:59 -08:00
Lincoln Stein
d01b7ea2d2
remove debug statement & actually do merge
2023-02-18 11:19:06 -05:00
Lincoln Stein
4fa91724d9
fix conversion of checkpoints into incompatible diffusers models
- The checkpoint conversion script was generating diffusers models
with the safety checker set to null. This resulted in models
that could not be merged with ones that have the safety checker
activated.
- This PR fixes the issue by incorporating the safety checker into
all 1.x-derived checkpoints, regardless of user's nsfw_checker setting.
2023-02-18 11:07:38 -05:00
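A rough sketch of what "incorporating the safety checker" looks like with the diffusers API: load the standard checker and feature extractor so the converted pipeline never carries a null safety_checker field. The surrounding conversion code is assumed, not shown:

```python
from diffusers.pipelines.stable_diffusion.safety_checker import StableDiffusionSafetyChecker
from transformers import AutoFeatureExtractor

safety_checker = StableDiffusionSafetyChecker.from_pretrained(
    "CompVis/stable-diffusion-safety-checker"
)
feature_extractor = AutoFeatureExtractor.from_pretrained(
    "CompVis/stable-diffusion-safety-checker"
)
# Whether the checker actually runs can still be toggled at runtime by the
# nsfw_checker setting; baking it into the converted model keeps the model
# mergeable with checker-enabled diffusers models.
```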
Damian Stewart
d5d2e1d7a3
Merge branch 'main' into fix/expected-torch-device
2023-02-18 15:23:08 +01:00
Jordan
141be95c2c
initial setup of lora support
2023-02-18 05:29:04 -07:00
Iman Karim
2bf2f627e4
Fix for issue #2707
2023-02-18 11:40:12 +01:00
blessedcoolant
11a70e9764
Merge branch 'main' into patch-14
2023-02-18 18:45:05 +13:00
Kevin Turner
6b702c32ca
fix(xformers): shush about not having Triton available.
It's not readily available on Windows and xformers only uses it on some very specific hardware anyway.
2023-02-17 17:41:27 -08:00