ComfyDeploy: How do the Model and Checkpoint Loaders for NF4 and FP4 work in ComfyUI?
What is Model and Checkpoint Loaders for NF4 and FP4?
Nodes for loading both checkpoints and UNET/diffusion models quantized to the bitsandbytes NF4 or FP4 format. Still under development, and some limitations, such as LoRA support, may still apply.
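To make the format concrete, here is a simplified, pure-Python sketch of block-wise NF4 quantization. This is an illustration of the idea only, not the bitsandbytes implementation: each block of weights stores one absmax scale plus a 4-bit index per weight into 16 fixed "NormalFloat" levels (the level values below are the NF4 code from the QLoRA paper).

```python
# Illustrative sketch of block-wise NF4 quantization (NOT the bitsandbytes code).
# Each block keeps one absmax scale; each weight becomes a 4-bit index into
# the 16 fixed NF4 levels.

NF4_LEVELS = [
    -1.0, -0.6961928009986877, -0.5250730514526367, -0.39491748809814453,
    -0.28444138169288635, -0.18477343022823334, -0.09105003625154495, 0.0,
    0.07958029955625534, 0.16093020141124725, 0.24611230194568634,
    0.33791524171829224, 0.44070982933044434, 0.5626170039176941,
    0.7229568362236023, 1.0,
]

def quantize_nf4(weights, blocksize=64):
    """Return (indices, scales): one 4-bit code per weight, one absmax per block."""
    indices, scales = [], []
    for start in range(0, len(weights), blocksize):
        block = weights[start:start + blocksize]
        absmax = max(abs(w) for w in block) or 1.0  # avoid divide-by-zero
        scales.append(absmax)
        for w in block:
            x = w / absmax  # normalized into [-1, 1]
            # pick the nearest of the 16 NF4 levels -> a 4-bit index
            indices.append(min(range(16), key=lambda i: abs(NF4_LEVELS[i] - x)))
    return indices, scales

def dequantize_nf4(indices, scales, blocksize=64):
    """Reconstruct approximate weights from codes and per-block scales."""
    return [NF4_LEVELS[idx] * scales[pos // blocksize]
            for pos, idx in enumerate(indices)]
```

The memory win is that each weight shrinks to 4 bits plus a small per-block overhead for the scale; the loaders in this repo let ComfyUI consume models already stored this way.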
How to install it in ComfyDeploy?
- Head over to the machine page
- Click on the "Create a new machine" button
- Select the "Edit build steps" option
- Add a new step -> Custom Node
- Search for "Model and Checkpoint Loaders for NF4 and FP4" and select it
- Close the build step dialog, then click on the "Save" button to rebuild the machine
~~NOTE: This is very likely Deprecated in favor of GGUF which seems to give better results~~
Some users may see a speedup by loading the UNET as NF4 with the loader from this repo while loading T5XXL as GGUF with the loader from https://github.com/city96/ComfyUI-GGUF
Now available in the ComfyUI Manager for easy installation. Make sure to select Channel: dev in the ComfyUI Manager menu, or install via git URL.
You can find the checkpoints and UNETs in the linked repositories on Hugging Face, or by searching for NF4 on Civitai:
CivitAI search link
nf4 flux unet only
nf4 flux dev checkpoint
nf4 flux schnell checkpoint
Requires installing bitsandbytes.
Make sure your ComfyUI is updated.
The nodes are:
"CheckpointLoaderNF4": "Load NF4 Flux Checkpoint"
"UNETLoaderNF4": "Load NF4 Flux UNET"
Just plug them into your Flux workflow in place of the regular loaders.
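For reference, ComfyUI discovers custom nodes through `NODE_CLASS_MAPPINGS` and `NODE_DISPLAY_NAME_MAPPINGS` in the package's `__init__.py`. The sketch below shows what the registration for these two nodes plausibly looks like; the class internals are placeholder stubs, and only the node names and display names come from the list above.

```python
# Hedged sketch of a ComfyUI custom node registration. The class bodies are
# illustrative stubs; the real nodes load NF4-quantized weights via bitsandbytes.

class CheckpointLoaderNF4:
    @classmethod
    def INPUT_TYPES(cls):
        # The real node enumerates available checkpoint files here.
        return {"required": {"ckpt_name": ("STRING", {"default": ""})}}

    RETURN_TYPES = ("MODEL", "CLIP", "VAE")
    FUNCTION = "load_checkpoint"
    CATEGORY = "loaders"

    def load_checkpoint(self, ckpt_name):
        raise NotImplementedError  # stub: real node returns model, clip, vae

class UNETLoaderNF4:
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"unet_name": ("STRING", {"default": ""})}}

    RETURN_TYPES = ("MODEL",)
    FUNCTION = "load_unet"
    CATEGORY = "loaders"

    def load_unet(self, unet_name):
        raise NotImplementedError  # stub: real node returns the NF4 UNET

# ComfyUI reads these two dicts to find the node classes and label them in the UI.
NODE_CLASS_MAPPINGS = {
    "CheckpointLoaderNF4": CheckpointLoaderNF4,
    "UNETLoaderNF4": UNETLoaderNF4,
}
NODE_DISPLAY_NAME_MAPPINGS = {
    "CheckpointLoaderNF4": "Load NF4 Flux Checkpoint",
    "UNETLoaderNF4": "Load NF4 Flux UNET",
}
```

In the graph editor the display names above are what you search for, and their MODEL/CLIP/VAE outputs connect exactly like the stock loaders' outputs.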
Code adapted from the implementation by Illyasviel at Forge.