Nodes Browser

ComfyDeploy: How does ComfyUI Dwpose TensorRT work in ComfyUI?

What is ComfyUI Dwpose TensorRT?

This project provides a TensorRT implementation of DWPose for ultra-fast pose estimation inside ComfyUI.

How to install it in ComfyDeploy?

Head over to the machine page

  1. Click on the "Create a new machine" button
  2. Select the Edit build steps
  3. Add a new step -> Custom Node
  4. Search for ComfyUI Dwpose TensorRT and select it
  5. Close the build step dialog, then click the "Save" button to rebuild the machine
<div align="center">

ComfyUI Dwpose TensorRT


</div> <p align="center"> <img src="assets/demo.PNG" /> </p>

This project provides a TensorRT implementation of DWPose for ultra-fast pose estimation inside ComfyUI.

This project is licensed under CC BY-NC-SA 4.0; everyone is free to access, use, modify, and redistribute it under the same license.

For commercial purposes, please contact me directly at yuvraj108c@gmail.com

If you like the project, please give me a star! ⭐


⏱️ Performance

Note: The following results were benchmarked on FP16 engines inside ComfyUI, using 1000 similar frames.

| Device | FPS |
| :----: | :-: |
|  L40s  | 20  |
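For context, the throughput above implies a per-frame latency budget; a quick sanity check of the benchmark arithmetic (values taken from the table above):

```python
# Sanity-check the benchmark figures reported above.
frames = 1000   # frames benchmarked (from the note above)
fps = 20        # measured throughput on an L40s with an FP16 engine

total_seconds = frames / fps    # wall-clock time for the whole run
latency_ms = 1000.0 / fps       # average time per frame, in milliseconds

print(total_seconds)  # 50.0 -> the 1000-frame run takes ~50 s
print(latency_ms)     # 50.0 -> ~50 ms per frame
```

At 20 FPS, each frame must complete in roughly 50 ms end to end.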

🚀 Installation

Navigate to the ComfyUI /custom_nodes directory:

```bash
git clone https://github.com/yuvraj108c/ComfyUI-Dwpose-Tensorrt
cd ./ComfyUI-Dwpose-Tensorrt
pip install -r requirements.txt
```

🛠️ Building TensorRT Engines

  1. Download the following ONNX models:

  2. Build TensorRT engines for both of these models by running:

    • python export_trt.py
  3. Place the exported engines inside the ComfyUI /models/tensorrt/dwpose directory

☀️ Usage

  • Insert the node via Right Click -> tensorrt -> Dwpose Tensorrt

🤖 Environment tested

  • Ubuntu 22.04 LTS, CUDA 12.4, TensorRT 10.2.0.post1, Python 3.10, L40s GPU
  • Windows (Not tested, but should work)

👏 Credits

License

Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)