ComfyDeploy: How Do Comflowy's Custom Nodes Work in ComfyUI?

What Are Comflowy's Custom Nodes?

Custom nodes for ComfyUI by Comflowy.

How to install it in ComfyDeploy?

Head over to the machine page

  1. Click the "Create a new machine" button
  2. Select Edit build steps
  3. Add a new step -> Custom Node
  4. Search for Comflowy's Custom Nodes and select it
  5. Close the build step dialog, then click the "Save" button to rebuild the machine
<img src="images/comflowy_banner.png" alt="banner"/>

Comflowy ComfyUI Extension

<div>

Version <a href="https://discord.gg/cj623WvcVx"> <img src="https://dcbadge.vercel.app/api/server/cj623WvcVx?style=flat" /> </a>

Chinese documentation

</div>

Most of the models used in ComfyUI are open-source, while some closed-source models with excellent results cannot be run there. To solve this problem, we developed the Comflowy extension. It integrates these high-quality closed-source models into ComfyUI, so that users can chain various closed-source models together through ComfyUI workflows.

I. Node List

  1. Comflowy LLM Node: This node calls an LLM, which you can use to implement functions similar to a Prompt Generator. Unlike other LLM nodes on the market, it obtains results by calling an API, so you don't need to install Ollama to run LLM models, and you don't need to worry about whether your computer is powerful enough to run them locally. It's also free.
    • You can use our online version of Comflowy to run workflows containing this node.

    • You can also download the workflow file and import it into ComfyUI for use.

    • <details> <summary>Workflow Screenshot</summary> <br/>

      image

    </details>
  2. Comflowy Omost Node: The Omost extension helps you write prompts, but running it locally requires a fairly powerful computer. Based on our understanding of Omost, we implemented a similar node with one difference: instead of running Omost's official model, we implemented it through prompt engineering, which makes it run faster.
    • Online version workflow.

    • Local version workflow file.

    • <details> <summary>Workflow Screenshot</summary> <br/>

      image

    </details>
  3. Comflowy Flux Node: This node generates images with Flux Pro. Flux Pro is not an open-source model, so in most cases you cannot use it in ComfyUI. To solve this problem, we developed this node, which lets you generate images with it directly in ComfyUI. Please note that Flux Pro is a commercial model, so each use deducts credits.
    • Online version App.

    • <details> <summary>Workflow Screenshot</summary> <br/>

      image

    </details>
  4. Comflowy Ideogram Node: This node generates images with Ideogram. Like Flux Pro, Ideogram is not an open-source model, so in most cases you cannot use it in ComfyUI. To solve this problem, we developed this node, which lets you generate images with it directly in ComfyUI. Please note that Ideogram is a commercial model, so each use deducts credits.
    • Online version App.

    • <details> <summary>Workflow Screenshot</summary> <br/>

      image

    </details>
  5. Comflowy Clarity Upscale Node: This node upscales images and is positioned as a replacement for Magnific. The indie developer levelsio has praised this model.
    • Online version App.

    • <details> <summary>Workflow Screenshot</summary> <br/>

      image

    </details>
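Under the hood, these nodes obtain results by POSTing to Comflowy's hosted API rather than running models locally. The sketch below illustrates that pattern against the prompt endpoint that appears in the troubleshooting note; the payload fields and Bearer-auth header are assumptions for illustration, not the extension's exact wire format.

```python
import json
import urllib.request

# Endpoint taken from the extension's own error message; everything else is illustrative.
COMFLOWY_PROMPT_URL = "https://app.comflowy.com/api/open/v0/prompt"

def build_prompt_request(api_key: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for the Comflowy LLM endpoint.

    The JSON payload shape and the Bearer authorization scheme are
    assumptions for this sketch; the nodes handle this internally once
    your API key is set.
    """
    payload = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        COMFLOWY_PROMPT_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # assumed auth scheme
        },
        method="POST",
    )

# Sending would then be: urllib.request.urlopen(build_prompt_request(key, "a cat"))
```

Because the heavy lifting happens server-side, the node itself stays lightweight: no model weights are downloaded and no GPU is needed locally.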

II. Price

| Node | Price |
| --- | --- |
| LLM | Free |
| Omost | Free |
| Flux | Flux-1.1-pro costs approximately 400 credits per image. Flux-pro costs 550 credits per image. |
| Ideogram | Ideogram-v2-turbo costs approximately 800 credits per image. Ideogram-v2 costs 500 credits per image. |
| Clarity Upscale | Approximately 500 credits per image, varying with your inputs. |

III. How to Use

> [!NOTE]
> When using the Comflowy extension, network problems may prevent it from working normally. If you encounter an error like `Failed to get response from LLM model with https://app.comflowy.com/api/open/v0/prompt`, check your network status.
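A quick way to tell a network problem apart from an API problem is to check whether the host responds at all. This is a stand-alone diagnostic sketch, not part of the extension:

```python
import urllib.error
import urllib.request

def can_reach(url: str, timeout: float = 5.0) -> bool:
    """Return True if the host answers at all (any HTTP status counts)."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # the server answered, even if with an error status
    except (urllib.error.URLError, OSError):
        return False  # DNS failure, timeout, refused connection, etc.

# e.g. can_reach("https://app.comflowy.com") -> False suggests a network
# issue on your side rather than a problem with the nodes themselves.
```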

<details> <summary>Step 1: Install Comflowy ComfyUI Extension</summary>
  • Method 1: Install using ComfyUI Manager (recommended)

  • Method 2: Git installation

    Open a cmd window in the ComfyUI extension directory (e.g., `ComfyUI\custom_nodes`) and type the following command:

    git clone https://github.com/6174/comflowy-nodes.git
    
  • Method 3: Download zip file

    Alternatively, download and unzip the zip file, then copy the resulting folder to the `ComfyUI\custom_nodes\` directory.

</details> <details> <summary>Step 2: Obtain Comflowy API Key</summary>

Next, you need to obtain the Comflowy API Key. Click on the avatar in the bottom left corner (Figure ①), then click on Settings (Figure ②), and finally find the API Key (Figure ③) and copy it. Note: for security, do not disclose your API Key to others.

image

</details> <details> <summary>Step 3: Enter Comflowy API Key</summary>

Lastly, enter the API Key into the Comflowy Set API Key node. After entering it once, you can delete this node and use the other Comflowy nodes. If you never set the key with this node, the other Comflowy nodes won't work.
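The reason the Set API Key node can be deleted after one run is that the key is persisted for the other nodes to read later. The snippet below sketches that set-once/read-many pattern in plain Python; the file name and JSON layout are assumptions for illustration, not the extension's actual storage format.

```python
import json
from pathlib import Path
from typing import Optional

def save_api_key(key: str, config_path: Path) -> None:
    """Persist the key so other nodes can read it on later runs.
    (Illustrative only: the real extension's storage location and
    format may differ.)"""
    config_path.write_text(json.dumps({"api_key": key}))

def load_api_key(config_path: Path) -> Optional[str]:
    """Return the stored key, or None if it was never set."""
    if not config_path.exists():
        return None
    return json.loads(config_path.read_text()).get("api_key")
```

This is why the other Comflowy nodes fail until the key has been set at least once: there is simply nothing stored for them to load.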

image

</details>

IV. Update Log

  • V0.2: Added Flux node, Ideogram node.
  • V0.1: Support for LLM node, Omost node, Http node.

V. Acknowledgements

  1. Thanks to SiliconFlow for providing free LLM services.
  2. Thanks to the author of Omost and the author of the ComfyUI-Omost extension.
  3. Thanks to all who contributed to this open source project:
<a href="https://github.com/6174/comflowy-nodes/graphs/contributors"> <img src="https://contrib.rocks/image?repo=6174/comflowy-nodes" /> </a>