ComfyDeploy: How does ComfyUI-WAN-ClipSkip work in ComfyUI?
What is ComfyUI-WAN-ClipSkip?
A custom node for ComfyUI that adds CLIP skip functionality to WAN workflows, letting you skip a specified number of layers in a CLIP model.
How to install it in ComfyDeploy?
Head over to the machine page:
- Click on the "Create a new machine" button
- Select the "Edit" build steps
- Add a new step -> Custom Node
- Search for `ComfyUI-WAN-ClipSkip` and select it
- Close the build step dialog and then click on the "Save" button to rebuild the machine
ComfyUI-CLIPSkip
A custom node for ComfyUI that adds CLIP skip functionality to the vanilla WAN workflow. This node lets you skip a specified number of layers in a CLIP model, which can adjust the style or quality of image embeddings in generation pipelines.
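Conceptually, "CLIP skip" means taking the hidden states from an earlier transformer layer instead of the final one. A minimal, framework-free sketch of the idea (the function name and layer list here are illustrative, not this node's actual implementation):

```python
def apply_clip_skip(hidden_states, skip_layers):
    """Pick the hidden state `skip_layers` layers before the last.

    hidden_states: list of per-layer outputs, index 0 = first layer.
    skip_layers: 0 keeps the final layer's output; 1 skips the last layer.
    """
    if skip_layers < 0 or skip_layers >= len(hidden_states):
        raise ValueError("skip_layers out of range")
    return hidden_states[len(hidden_states) - 1 - skip_layers]

# With a 24-layer model (like the supported umt5 checkpoint),
# skip_layers=1 selects the second-to-last layer's output.
layers = [f"layer_{i}_output" for i in range(24)]
print(apply_clip_skip(layers, 0))  # layer_23_output
print(apply_clip_skip(layers, 1))  # layer_22_output
```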
Installation
Via Git (Recommended)
- Open a terminal in your ComfyUI `custom_nodes` directory:

```
cd ComfyUI/custom_nodes
```

- Clone this repository:

```
git clone https://github.com/yourusername/ComfyUI-CLIPSkip.git
```

- Restart ComfyUI. The node will be loaded automatically.
- Ensure you have the required WAN CLIP model (e.g., `umt5_xxl_fp8_e4m3fn_scaled.safetensors`) in `ComfyUI/models/text_encoders/`.
Manual Installation
- Download this repository as a ZIP file.
- Extract it into the `ComfyUI/custom_nodes` directory.
- Rename the folder to `ComfyUI-CLIPSkip` if needed.
- Restart ComfyUI.
Dependencies
- ComfyUI (latest version recommended)
- PyTorch (installed with ComfyUI)
No additional dependencies are required.
Usage
- Load a CLIP Vision model using `CLIPVisionLoader` or any other node that outputs `CLIP_VISION`.
- Connect the `clip_vision` output to the `clip` input of `CLIPSkip`.
- Set the `skip_layers` parameter (e.g., 1 to skip the last layer, 0 to disable skipping).
- Connect the output `clip` to any node that accepts `CLIP_VISION` (e.g., `CLIPVisionEncode`).
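For readers curious how such a node plugs into ComfyUI, here is a rough sketch following the standard custom-node conventions (`INPUT_TYPES`, `RETURN_TYPES`, `FUNCTION`). The `DummyClipVision` stand-in and the internals of `skip` are illustrative assumptions for demonstration, not this repository's actual code:

```python
import copy


class CLIPSkip:
    """Illustrative sketch of a CLIP-skip node in ComfyUI's node format."""

    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "clip": ("CLIP_VISION",),
                "skip_layers": ("INT", {"default": 0, "min": 0, "max": 23}),
            }
        }

    RETURN_TYPES = ("CLIP_VISION",)
    FUNCTION = "skip"
    CATEGORY = "conditioning"

    def skip(self, clip, skip_layers):
        # Work on a copy so upstream nodes keep the original model.
        clip = clip.clone()
        # Drop the last `skip_layers` transformer layers (0 = no-op).
        if skip_layers > 0:
            clip.layers = clip.layers[: len(clip.layers) - skip_layers]
        return (clip,)


class DummyClipVision:
    """Minimal stand-in exposing the clone() + layers interface assumed above."""

    def __init__(self, n_layers):
        self.layers = list(range(n_layers))

    def clone(self):
        return copy.deepcopy(self)


model = DummyClipVision(24)
(skipped,) = CLIPSkip().skip(model, skip_layers=1)
print(len(skipped.layers))  # 23
print(len(model.layers))    # 24, the original is untouched
```

Returning the result as a one-element tuple matches ComfyUI's convention that node functions return one value per entry in `RETURN_TYPES`.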
Example Workflow
```
CLIPVisionLoader -> CLIPSkip -> CLIPVisionEncode -> (further pipeline)
```
Supported Models
- `umt5_xxl_fp8_e4m3fn_scaled.safetensors` (24 layers).
Notes
- Designed for WAN-type CLIP models in ComfyUI.
- Requires ComfyUI with FP8 support for optimal performance.
License
MIT License (see the `LICENSE` file for details).
Contributing
Feel free to submit issues or pull requests on GitHub!