ComfyDeploy: How does ComfyUI_HF_Inference work in ComfyUI?
What is ComfyUI_HF_Inference?
Unofficial support for Hugging Face's hosted inference.
How to install it in ComfyDeploy?
Head over to the machine page
- Click on the "Create a new machine" button
- Select the "Edit build steps" option
- Add a new step -> Custom Node
- Search for ComfyUI_HF_Inference and select it
- Close the build step dialog and then click on the "Save" button to rebuild the machine
Hugging Face hosted inference nodes for ComfyUI
Unofficial ComfyUI nodes for Hugging Face's inference API
Visit the official docs for an overview of how the HF inference endpoints work
Find models by task on the official website
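For context, these nodes wrap the serverless Inference API, which exposes one endpoint per model ID and authenticates with a Bearer token. A minimal sketch of the underlying request (the model ID `gpt2` is just an example, not something the nodes prescribe):

```python
import os

import requests

# Serverless Inference API: one URL per model ID.
API_URL = "https://api-inference.huggingface.co/models/gpt2"  # example model
headers = {"Authorization": f"Bearer {os.environ['HF_AUTH_TOKEN']}"}

# Text-generation style payload; other tasks use different input schemas.
response = requests.post(API_URL, headers=headers, json={"inputs": "Hello, world"})
response.raise_for_status()
print(response.json())
```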
Installation
Clone and install dependencies
```sh
git clone https://github.com/bitaffinity/ComfyUI_HF_Inference custom_nodes/ComfyUI_HF_Inference
cd custom_nodes/ComfyUI_HF_Inference
pip install -r requirements.txt
```
Export HF_AUTH_TOKEN with one of your Hugging Face tokens
Run ComfyUI
```sh
HF_AUTH_TOKEN=hf_1111111111111111111111111111111111 python main.py
```
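Presumably the nodes read the token from that environment variable at request time; a minimal sketch of the pattern, assuming that behavior:

```python
import os

# Assumption: the nodes look up HF_AUTH_TOKEN from the environment,
# based on the run command shown above.
token = os.environ.get("HF_AUTH_TOKEN")
if not token:
    raise RuntimeError("HF_AUTH_TOKEN is not set; export it before starting ComfyUI")
```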
Nodes
> [!WARNING]
> Inference API (serverless) requires a model 10GB or below and fails for random reasons on different models.
Text
- Feature Extraction
- Question Answering
- Translation
- Generation
Image
- Classification
- Object Detection
- Segmentation
- TextToImage
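To illustrate what an image-task node such as TextToImage wraps, the raw endpoint returns image bytes rather than JSON. A hedged sketch, again against the serverless API; the model ID below is an example assumption, not one the nodes require:

```python
import os

import requests

# TextToImage via the serverless API: the response body is the image itself,
# not JSON. Example model; any text-to-image model 10GB or below should fit.
API_URL = "https://api-inference.huggingface.co/models/stabilityai/stable-diffusion-2-1"
headers = {"Authorization": f"Bearer {os.environ['HF_AUTH_TOKEN']}"}

payload = {"inputs": "an astronaut riding a horse, photorealistic"}
response = requests.post(API_URL, headers=headers, json=payload)
response.raise_for_status()

with open("output.png", "wb") as f:
    f.write(response.content)  # raw image bytes
```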