Nodes Browser

ComfyDeploy: How does ComfyUI-Tara-LLM-Integration work in ComfyUI?

What is ComfyUI-Tara-LLM-Integration?

Tara is a powerful node for ComfyUI that integrates Large Language Models (LLMs) to enhance and automate workflow processes. With Tara, you can create complex, intelligent workflows that refine and generate content, manage API keys, and seamlessly integrate various LLMs into your projects.

How to install it in ComfyDeploy?

Head over to the machine page

  1. Click on the "Create a new machine" button
  2. Select "Edit build steps"
  3. Add a new step -> Custom Node
  4. Search for ComfyUI-Tara-LLM-Integration and select it
  5. Close the build step dialog, then click the "Save" button to rebuild the machine

Tara - ComfyUI Node for LLM Integration

<p align="center"> <img src="logo/tara-logo.webp" alt="Tara Logo" width="200"/> </p>

Introduction

Tara is a powerful node for ComfyUI that integrates Large Language Models (LLMs) to enhance and automate workflow processes. With Tara, you can create complex, intelligent workflows that refine and generate content, manage API keys, and seamlessly integrate various LLMs into your projects.

Currently, Tara supports OpenAI and Groq (all models), with plans to expand support to together.ai and Replicate. Note: Mixtral-8x7b-32768 from Groq is quite good and free to use as of this writing.
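
Both OpenAI and Groq expose an OpenAI-style chat-completions API, which is what lets a single set of nodes target either provider. As a rough illustration of the kind of call the nodes make under the hood, here is a minimal sketch using the official openai Python client; the Groq base URL is an assumption taken from Groq's OpenAI-compatible endpoint and is not part of Tara's documented API:

```python
from openai import OpenAI  # pip install openai

# Any OpenAI-compatible endpoint works; for Groq, point base_url at its
# OpenAI-compatible API (assumed here; check Groq's docs for the current URL).
client = OpenAI(
    api_key="YOUR_GROQ_API_KEY",
    base_url="https://api.groq.com/openai/v1",
)

response = client.chat.completions.create(
    model="mixtral-8x7b-32768",  # the Groq-hosted model mentioned above
    messages=[
        {"role": "system", "content": "You refine Stable Diffusion prompts."},
        {"role": "user", "content": "A cozy cabin in the woods, winter"},
    ],
)
print(response.choices[0].message.content)
```

Swapping api_key, base_url, and model is all it takes to move between providers, which is what the config nodes described below encapsulate.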

Features

Tara comprises the following main nodes:

  • TaraPrompterAdvancedNode: The universal node for all OpenAI-compatible APIs and the design going forward. It takes llm_config as an input, which is generated by TaraLLMConfigNode or TaraPresetLLMConfigNode (a minimal, hypothetical sketch of this wiring appears after this list).
  • TaraApiKeySaver: Provides a secure way to save and store API keys internally.
  • TaraLLMConfigNode: Takes an OpenAI-compatible API configuration and outputs llm_config, which can be connected to TaraPrompterAdvancedNode and TaraAdvancedCompositionNode.
  • TaraAdvancedCompositionNode: Replaces the daisy-chain approach; Composition is a better and more apt name, and it is the name going forward. Multiple texts can be composed this way.
  • TaraPresetLLMConfigNode: Provides predefined llm_config presets for OpenAI and Groq.
  • TaraApiKeyLoader: Manages and loads saved API keys for different LLM services.
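
To make the llm_config hand-off concrete, here is a heavily simplified, hypothetical sketch in the usual ComfyUI custom-node style; the class names, input fields, and defaults are illustrative and do not reflect Tara's actual implementation:

```python
# Hypothetical, simplified sketch of the llm_config hand-off between nodes.
# Field names, defaults, and return types are illustrative, not Tara's actual code.

class ExampleLLMConfigNode:
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {
            "base_url": ("STRING", {"default": "https://api.openai.com/v1"}),
            "api_key": ("STRING", {"default": ""}),
            "model": ("STRING", {"default": "gpt-4o-mini"}),
            "temperature": ("FLOAT", {"default": 0.7}),
        }}

    RETURN_TYPES = ("LLM_CONFIG",)
    FUNCTION = "make_config"
    CATEGORY = "llm"

    def make_config(self, base_url, api_key, model, temperature):
        # The "config" is just a plain dict that downstream nodes consume.
        return ({"base_url": base_url, "api_key": api_key,
                 "model": model, "temperature": temperature},)


class ExamplePrompterNode:
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {
            "llm_config": ("LLM_CONFIG",),
            "guidance": ("STRING", {"multiline": True}),
            "prompt": ("STRING", {"multiline": True}),
        }}

    RETURN_TYPES = ("STRING",)
    FUNCTION = "generate"
    CATEGORY = "llm"

    def generate(self, llm_config, guidance, prompt):
        from openai import OpenAI
        client = OpenAI(api_key=llm_config["api_key"],
                        base_url=llm_config["base_url"])
        resp = client.chat.completions.create(
            model=llm_config["model"],
            temperature=llm_config["temperature"],
            messages=[{"role": "system", "content": guidance},
                      {"role": "user", "content": prompt}],
        )
        return (resp.choices[0].message.content,)
```

The point is simply that llm_config is produced once by a config node and reused by any node that needs to talk to the LLM.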

Deprecated

  • TaraPrompter (phasing out): Utilizes input guidance to generate refined positive and negative outcomes.
  • TaraDaisyChainNode: Enables complex workflows by allowing outputs to be daisy-chained into subsequent prompts, facilitating intricate operations like checklist creation, verification, execution, evaluation, and refinement.
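
For intuition, daisy-chaining (now "composition") just means feeding one completion into the next prompt. Below is a minimal, hypothetical sketch outside ComfyUI, reusing an OpenAI-compatible client like the one shown earlier; the function names and prompts are illustrative, not Tara's API:

```python
# Illustrative sketch of the daisy-chain / composition idea: the text produced by
# one LLM call is fed into the next. Names and prompts are hypothetical.
def ask(client, model, system, user):
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "system", "content": system},
                  {"role": "user", "content": user}],
    )
    return resp.choices[0].message.content

def checklist_then_refine(client, model, task):
    # Step 1: produce a checklist for the task.
    checklist = ask(client, model, "Write a short checklist for the task.", task)
    # Step 2: feed that checklist back in to evaluate and refine the result.
    return ask(client, model,
               "Refine the task description so it satisfies every checklist item.",
               f"Task: {task}\nChecklist:\n{checklist}")
```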

Installation

Tara can be installed using one of the following methods:

Method 1:

  1. Navigate to the custom_nodes directory inside the ComfyUI root folder, typically found at /workspace/ComfyUI/custom_nodes in Runpod.
  2. Clone this repository using git clone https://github.com/ronniebasak/ComfyUI-Tara-LLM-Integration.git .
  3. Restart ComfyUI and reload your browser to apply changes.

Method 2: (Prerequisite: ComfyUI manager)

  1. Open ComfyUI Manager.
  2. Click "Install via Git URL".
  3. Paste the link of this repository ( https://github.com/ronniebasak/ComfyUI-Tara-LLM-Integration.git ).
  4. Restart ComfyUI and reload your browser to apply changes.

Usage

TaraPrompter

  • Input: Guidance, Positive, Negative
  • Output: Refined Positive, Refined Negative
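
One way such a refinement could work is a single chat completion that rewrites both prompts under the given guidance. This is purely illustrative; Tara's actual prompt templates and output parsing may differ:

```python
# Illustrative only: refine a positive/negative prompt pair with one chat completion.
def refine(client, model, guidance, positive, negative):
    resp = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system",
             "content": f"Rewrite the prompts below following this guidance: {guidance}. "
                        "Reply with two lines: 'POSITIVE: ...' and 'NEGATIVE: ...'."},
            {"role": "user",
             "content": f"POSITIVE: {positive}\nNEGATIVE: {negative}"},
        ],
    )
    text = resp.choices[0].message.content
    # Parse the two labeled lines; fall back to the originals if parsing fails.
    lines = {l.split(":", 1)[0].strip().upper(): l.split(":", 1)[1].strip()
             for l in text.splitlines() if ":" in l}
    return lines.get("POSITIVE", positive), lines.get("NEGATIVE", negative)
```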

TaraApiKeyLoader

Loads and provides API keys for other Tara nodes.

TaraApiKeySaver

Securely saves API keys used by Tara nodes.
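
As noted in the Disclaimer section, saved keys live in a temporary file on the server. A rough, hypothetical illustration of that idea follows; the path and JSON format are assumptions, not Tara's actual storage scheme:

```python
import json
import os

# Hypothetical illustration of temp-file key storage (see the Disclaimer section);
# the real node's path and format may differ.
KEY_FILE = "/tmp/tara_api_keys.json"

def load_all_keys() -> dict:
    if os.path.exists(KEY_FILE):
        with open(KEY_FILE) as f:
            return json.load(f)
    return {}

def save_key(provider: str, api_key: str) -> None:
    keys = load_all_keys()
    keys[provider] = api_key
    with open(KEY_FILE, "w") as f:
        json.dump(keys, f)
    os.chmod(KEY_FILE, 0o600)  # restrict the file to the current user

def load_key(provider: str):
    return load_all_keys().get(provider)
```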

TaraDaisyChainNode

  • Input: Guidance, Prompt, Positive, Negative
  • Output: Generated text, suitable for chaining in workflows.

Working with the Workflow

  1. Add the TaraApiKeySaver to an empty workflow
  2. Select openai and enter your OpenAI API key (trial credits are available for new users). Queue the prompt. Nothing visible will happen, but the key should be saved to the configuration; confirmation feedback is planned.
  3. Select groq and enter your Groq API key (free to obtain as of this writing). Queue the prompt.
  4. Clear the workflow.
  5. In a new workflow, use TaraApiKeyLoader; it should fetch the keys configured previously. You can also use a Text _O or Primitive node to enter the API key directly in a workflow and connect it to TaraPrompter or TaraDaisyChainNode.

Future Plans

  • Integration with together.ai for collaborative LLM usage.
  • Support for Replicate, enabling access to a broader range of AI models.

Contributing

We welcome contributions to Tara! If you have suggestions or improvements, please fork the repository and submit a pull request.

Disclaimer

Each LLM and service (even open-source ones) has its own Terms of Service (ToS); by using it with Tara, you agree to those terms.

When using Tara with a hosted ComfyUI service, there is an option to save the API key temporarily. This action stores the key in the /tmp directory, which is typically auto-deleted on Linux-based systems. However, this might pose a risk if the server is shared among multiple users, as one person's API key could potentially overwrite another's. As Tara is in its alpha version, addressing bugs is our priority. In the meantime, users may opt to input the API key as a text (or primitive) node; this method ensures the API key is never saved on the server. When using a primitive node, remember to collapse it during screen recording or sharing to protect your API key.

License

Tara is released under the GNU General Public License v3.0 (GPLv3).

Star History

Star History Chart