psychedelicious
d63ff560d6
feat(ui): number collection generator supports floats
2025-01-17 12:19:04 +11:00
psychedelicious
acceac8304
fix(ui): do not set number collection field to undefined when removing last item
2025-01-17 12:19:04 +11:00
psychedelicious
96671d12bd
fix(ui): filter out batch nodes when checking readiness on workflows tab
2025-01-17 12:19:04 +11:00
psychedelicious
584601d03f
perf(ui): memoize selector in workflows
2025-01-17 12:19:04 +11:00
psychedelicious
b1c4ec0888
feat(ui): rough out number generators for number collection fields
2025-01-17 12:19:04 +11:00
psychedelicious
db5f016826
fix(nodes): allow batch datum items to mix ints and floats
...
Unfortunately, we cannot enforce strict floats or ints.
The batch data models don't specify the value types; instead they rely on pydantic parsing. JSON doesn't differentiate between floats and ints, so a float `1.0` gets parsed as `1` in Python.
As a result, we _must_ accept mixed floats and ints for BatchDatum.items.
Tests and validation updated to handle this.
Maybe we should update the BatchDatum model to have a `type` field? Then we could parse as float or int, depending on the inputs...
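For illustration, a minimal sketch of why the distinction is lost in transit (plain stdlib `json`, nothing InvokeAI-specific — the client's serializer emits a float `1.0` as the JSON number `1`):

```python
import json

# Wire format as produced by a JS client, where JSON.stringify(1.0) yields "1":
# by the time the payload reaches Python, the "floatness" of 1.0 is gone.
payload = '{"items": [1, 2.5]}'
data = json.loads(payload)

# json.loads maps JSON numbers without a fractional part to int,
# so a mixed list is unavoidable without a separate type hint.
print([type(v).__name__ for v in data["items"]])  # ['int', 'float']
```

This is why `BatchDatum.items` must accept mixed ints and floats, and why a `type` field on the model would be needed to parse strictly one way or the other.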
2025-01-17 12:19:04 +11:00
psychedelicious
c1fd28472d
fix(ui): float batch data creation
2025-01-17 12:19:04 +11:00
psychedelicious
0c5958675a
chore(ui): lint
2025-01-17 12:19:04 +11:00
psychedelicious
912e07f2c8
tidy(ui): use zod typeguard builder util for fields
2025-01-17 12:19:04 +11:00
psychedelicious
f853b24868
chore(ui): typegen
2025-01-17 12:19:04 +11:00
psychedelicious
4f900b22dc
feat(ui): validate number item multipleOf
2025-01-17 12:19:04 +11:00
psychedelicious
5823532941
feat(ui): validate string item lengths
2025-01-17 12:19:04 +11:00
psychedelicious
bfe6d98cba
feat(ui): support float batches
2025-01-17 12:19:04 +11:00
psychedelicious
c26b3cd54f
refactor(ui): abstract out helper to add batch data
2025-01-17 12:19:04 +11:00
psychedelicious
c012d832d2
fix(ui): typo
2025-01-17 12:19:04 +11:00
psychedelicious
9d11d2aabd
refactor(ui): abstract out field validators
2025-01-17 12:19:04 +11:00
psychedelicious
a5f1587ce7
feat(ui): add template validation for integer collection items
2025-01-17 12:19:04 +11:00
psychedelicious
0b26bb1ca3
feat(ui): add template validation for string collection items
2025-01-17 12:19:04 +11:00
psychedelicious
0f1e632117
feat(nodes): add float batch node
2025-01-17 12:19:04 +11:00
psychedelicious
b212332b3e
feat(ui): support integer batches
2025-01-17 12:19:04 +11:00
psychedelicious
90a91ff438
feat(nodes): add integer batch node
2025-01-17 12:19:04 +11:00
psychedelicious
b52b271dc4
feat(ui): support string batches
2025-01-17 12:19:04 +11:00
psychedelicious
e077fe8046
refactor(ui): streamline image field collection input logic, support multiple images w/ same name in collection
2025-01-17 12:19:04 +11:00
psychedelicious
368957b208
tweak(ui): image field collection input component styling
2025-01-17 12:19:04 +11:00
psychedelicious
27277e1fd6
docs(ui): improved comments for image batch node special handling
2025-01-17 12:19:04 +11:00
psychedelicious
236c0d89e7
feat(nodes): add string batch node
2025-01-17 12:19:04 +11:00
psychedelicious
b807170701
fix(ui): typo in error message for image collection fields
2025-01-17 12:19:04 +11:00
Ryan Dick
0cf51cefe8
Revise the logic for calculating the RAM model cache limit.
2025-01-16 23:46:07 +00:00
Ryan Dick
e5e848d239
Update config docstring.
2025-01-16 22:34:23 +00:00
Ryan Dick
da589b3f1f
Memory optimization to load state dicts one module at a time in CachedModelWithPartialLoad when we are not storing a CPU copy of the state dict (i.e. when keep_ram_copy_of_weights=False).
2025-01-16 17:00:33 +00:00
Ryan Dick
36a3869af0
Add keep_ram_copy_of_weights config option.
2025-01-16 15:35:25 +00:00
Ryan Dick
c76d08d1fd
Add keep_ram_copy option to CachedModelOnlyFullLoad.
2025-01-16 15:08:23 +00:00
Ryan Dick
04087c38ce
Add keep_ram_copy option to CachedModelWithPartialLoad.
2025-01-16 14:51:44 +00:00
Ryan Dick
b2bb359d47
Update the model loading logic for several of the large FLUX-related models to ensure that the model is initialized on the meta device prior to loading the state dict into it. This helps to keep peak memory down.
2025-01-16 02:30:28 +00:00
Mary Hipp
b57aa06d9e
take out AbortController logic and simplify dependencies
2025-01-16 09:39:32 +11:00
Mary Hipp
f856246c36
try removing abortcontroller
2025-01-16 09:39:32 +11:00
Mary Hipp
195df2ebe6
remove logic changes, keep logging
2025-01-16 09:39:32 +11:00
Mary Hipp
7b5cef6bd7
lint fix
2025-01-16 09:39:32 +11:00
Mary Hipp
69e7ffaaf5
add logging, remove deps
2025-01-16 09:39:32 +11:00
psychedelicious
993401ad6c
fix(ui): hide layer when previewing filter
...
Previously, when previewing a filter on a layer with some transparency, or a filter that changes the alpha, the preview was rendered on top of the layer. The preview blended with the layer, which isn't right.
In this change, the layer is hidden during the preview, and when the filter finishes (whether applied or canceled, the two possible paths), the layer is shown again.
Technically, we are hiding and showing the layer's object renderer's Konva group, which contains the layer's "real" data.
Another small change prevents a flash of an empty layer by waiting to destroy the previous filter preview image until the new preview image is ready to display.
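The hide-then-restore flow can be sketched as a context manager (names here are illustrative, not the app's API; `Layer.visible` stands in for toggling the Konva group's visibility):

```python
from contextlib import contextmanager

class Layer:
    """Hypothetical stand-in for the layer's object renderer's Konva group."""
    def __init__(self):
        self.visible = True

@contextmanager
def filter_preview(layer: Layer):
    # Hide the real layer so the preview doesn't blend with it.
    layer.visible = False
    try:
        yield
    finally:
        # Restore visibility on both exit paths: applied or canceled.
        layer.visible = True
```

The `finally` block is the key point: the layer is shown again regardless of whether the filter was applied, canceled, or raised an error mid-preview.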
2025-01-16 09:27:36 +11:00
psychedelicious
8d570dcffc
chore(ui): typegen
2025-01-16 09:27:36 +11:00
psychedelicious
3f70e947fd
chore: ruff
2025-01-16 09:27:36 +11:00
dunkeroni
157290bef4
add: size option for image noise node and filter
2025-01-16 09:27:36 +11:00
dunkeroni
b7389da89b
add: Noise filter on Canvas
2025-01-16 09:27:36 +11:00
dunkeroni
254b89b1f5
add: Blur filter option on canvas
2025-01-16 09:27:36 +11:00
dunkeroni
2b122d7882
add: image noise invocation
2025-01-16 09:27:36 +11:00
dunkeroni
ded9213eb4
trim blur splitting logic
2025-01-16 09:27:36 +11:00
dunkeroni
9d51eb49cd
fix: ImageBlurInvocation handles transparency now
2025-01-16 09:27:36 +11:00
dunkeroni
0a6e22bc9e
fix: ImagePasteInvocation respects transparency
2025-01-16 09:27:36 +11:00
Ryan Dick
b301785dc8
Normalize the T5 model identifiers so that a FLUX T5 or an SD3 T5 model can be used interchangeably.
2025-01-16 08:33:58 +11:00