ComfyDeploy: How does ComfyUI-NSFW-Detection work in ComfyUI?

What is ComfyUI-NSFW-Detection?

An implementation of NSFW Detection for ComfyUI

How to install it in ComfyDeploy?

Head over to the machine page

  1. Click on the "Create a new machine" button
  2. Select "Edit build steps"
  3. Add a new step -> Custom Node
  4. Search for ComfyUI-NSFW-Detection and select it
  5. Close the build step dialog, then click the "Save" button to rebuild the machine

Project Overview

This project detects whether images generated by ComfyUI are Not Safe For Work (NSFW). A machine learning classifier scores each image; if the score crosses a configurable threshold, the image is treated as NSFW and a user-supplied alternative image is returned instead.
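
The decision itself is a simple threshold comparison. The sketch below illustrates that idea in plain Python; the helper name and the way the NSFW probability is obtained are hypothetical, not the project's actual code:

```python
# Illustrative sketch of the filtering step described above
# (hypothetical helper, not the project's actual code).
def filter_nsfw(image, nsfw_probability, score, alternative_image):
    # Treat the image as NSFW once its probability exceeds the threshold.
    if nsfw_probability > score:
        return alternative_image
    return image
```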

Install

  1. Clone this repo into the custom_nodes directory of your ComfyUI installation

  2. Run pip install -r requirements.txt

Usage

The main functionality of the project is encapsulated in the NSFWDetection class in node.py. Its run method takes three parameters (a usage sketch follows the list):

  • image: The image to be classified.
  • score: The threshold score for classifying an image as NSFW.
  • alternative_image: The image to be returned if the input image is classified as NSFW.
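
The run method can also be exercised directly from Python for quick testing. In the minimal sketch below, the class and parameter names come from this README, while the import path, instantiation, tensor shapes, and return format are assumptions based on typical ComfyUI nodes:

```python
import torch

# Assumed import path; the class lives in node.py at the repo root.
from node import NSFWDetection

detector = NSFWDetection()

# ComfyUI typically passes images as [batch, height, width, channels]
# float tensors; the random tensor below stands in for a generated image.
image = torch.rand(1, 512, 512, 3)
fallback = torch.zeros(1, 512, 512, 3)  # returned if the input is flagged NSFW

# score is the NSFW threshold described above; the keyword names follow
# this README, while the return format is an assumption.
result = detector.run(image=image, score=0.9, alternative_image=fallback)
```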

Example

![Example workflow](images/example.png)

An example workflow is available at https://github.com/trumanwong/ComfyUI-NSFW-Detection/blob/main/workflow.json

Contributing

Contributions are welcome. Please submit a pull request if you have any improvements or bug fixes.

License

This project is licensed under the terms of the MIT license.