Nodes Browser

ComfyDeploy: How does Ostris Nodes ComfyUI work in ComfyUI?

What is Ostris Nodes ComfyUI?

This is a collection of custom nodes for ComfyUI that I made for some quality-of-life (QOL) improvements. I will be adding more advanced ones in the future once I get more familiar with the API.

How to install it in ComfyDeploy?

Head over to the machine page

  1. Click on the "Create a new machine" button
  2. Select the Edit build steps
  3. Add a new step -> Custom Node
  4. Search for Ostris Nodes ComfyUI and select it
  5. Close the build step dialog, then click the "Save" button to rebuild the machine

Ostris Nodes ComfyUI

This is a collection of custom nodes for ComfyUI that I made for some quality-of-life (QOL) improvements. I will be adding more advanced ones in the future once I get more familiar with the API.

Installation

Just git clone this repo into your ComfyUI custom_nodes folder, install the requirements, and restart ComfyUI.

cd <your_comfyui_folder>/custom_nodes
git clone https://github.com/ostris/ostris_nodes_comfyui.git
cd ostris_nodes_comfyui
pip install --upgrade -r requirements.txt

Current Nodes

General

  • One Seed - A universal seed node with numerous output formats (see the sketch after this list)
    • seed (SEED)
    • int (int)
    • number (NUMBER)
    • float (FLOAT)
    • string (STRING)
    • zfill (STRING) - zero filled to 16 digits
  • Text Box - Just a simple textbox for now
    • string (STRING)
    • text (TEXT)
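
To make the node layout concrete, here is a minimal sketch of how a ComfyUI custom node like One Seed might expose a single seed value through several typed outputs. The class name, category string, and the exact set of outputs are illustrative assumptions (the custom SEED and NUMBER types are omitted for simplicity); this is not the actual implementation in this repo.

# Illustrative sketch only -- not the actual One Seed implementation.
class OneSeedSketch:
    """Takes one seed and exposes it in several output formats."""

    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                # Standard ComfyUI seed range
                "seed": ("INT", {"default": 0, "min": 0, "max": 0xFFFFFFFFFFFFFFFF}),
            }
        }

    # One entry per output socket; names mirror the list above
    RETURN_TYPES = ("INT", "FLOAT", "STRING", "STRING")
    RETURN_NAMES = ("int", "float", "string", "zfill")
    FUNCTION = "convert"
    CATEGORY = "ostris/sketch"  # hypothetical category name

    def convert(self, seed):
        return (
            int(seed),
            float(seed),
            str(seed),
            str(seed).zfill(16),  # zero-filled to 16 digits, as described above
        )


# Register the node so ComfyUI can discover it
NODE_CLASS_MAPPINGS = {"OneSeedSketch": OneSeedSketch}
NODE_DISPLAY_NAME_MAPPINGS = {"OneSeedSketch": "One Seed (sketch)"}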

LLM Prompt Upsampling (BETA)

Based on the fantastic work of sayakpaul/caption-upsampling. It uses an LLM to expand your prompt into a richer, more descriptive one.

WARNING: This is experimental. It will likely remove TI embeddings. It loads in 4bit mode but will still be VRAM hungry.

  • LLM Pipe Loader - Loads the LLM pipeline to use
    • model_name (STRING) - Hugging Face model name, e.g. HuggingFaceH4/zephyr-7b-beta
  • LLM Prompt Upsampling - Upsamples a prompt (see the sketch below)
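
To make the BETA behaviour more concrete, here is a minimal standalone sketch of the prompt-upsampling idea these nodes wrap: load an instruction-tuned LLM such as HuggingFaceH4/zephyr-7b-beta in 4-bit mode and ask it to rewrite a short prompt as a richer caption. The system prompt wording and generation settings are illustrative assumptions, not the repo's exact code.

# Minimal prompt-upsampling sketch, assuming transformers + bitsandbytes are installed.
# Not the repo's exact implementation; the system prompt and settings are illustrative.
from transformers import pipeline, BitsAndBytesConfig

pipe = pipeline(
    "text-generation",
    model="HuggingFaceH4/zephyr-7b-beta",
    model_kwargs={"quantization_config": BitsAndBytesConfig(load_in_4bit=True)},
    device_map="auto",
)

messages = [
    {"role": "system",
     "content": "Expand the user's image prompt into a single, richly detailed caption."},
    {"role": "user", "content": "a cat sitting on a windowsill"},
]

# Build a chat-formatted prompt for the instruction-tuned model
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
out = pipe(prompt, max_new_tokens=200, do_sample=True, temperature=0.7, return_full_text=False)
print(out[0]["generated_text"])

Roughly speaking, the LLM Pipe Loader node corresponds to the model-loading half of this sketch and LLM Prompt Upsampling to the generation half.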

Example: ![Prompt upsampling demo](https://raw.githubusercontent.com/ostris/ostris_nodes_comfyui/main/assets/prompt_upsampling_demo.jpg)