ComfyDeploy: How does NSFW Check for ComfyUI work in ComfyUI?
What is NSFW Check for ComfyUI?
This project is designed to detect whether images generated by ComfyUI are Not Safe For Work (NSFW). It uses a machine learning model to classify images as either safe or not safe for work and returns a confidence score for the NSFW classification.
How to install it in ComfyDeploy?
- Head over to the machine page
- Click on the "Create a new machine" button
- Select the Edit build steps
- Add a new step -> Custom Node
- Search for NSFW Check for ComfyUI and select it
- Close the build step dialog, then click on the "Save" button to rebuild the machine
NSFW Check for ComfyUI
Project Overview
This project is designed to detect whether images generated by ComfyUI are Not Safe For Work (NSFW). It uses a machine learning model to classify images as either safe or not safe for work and returns a confidence score for the NSFW classification.
Using the score, a user can add logical filters to their workflow.
A threshold of 0.95 works well for most cases.
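As a minimal sketch of such a filter (the names below are illustrative placeholders, not part of the node's API), the suggested 0.95 cutoff could be applied to a single score like this:

```python
# Sketch of a downstream filter using the node's confidence score.
# `nsfw_score` stands in for the value the node returns for one image;
# 0.95 is the threshold suggested above, not a value enforced by the node.
NSFW_THRESHOLD = 0.95

def is_safe(nsfw_score: float, threshold: float = NSFW_THRESHOLD) -> bool:
    """Treat an image as safe when its NSFW confidence stays below the threshold."""
    return nsfw_score < threshold
```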
Usage
While you can use this node to quickly check the NSFW score of an image, it is most useful when you run a workflow as an API. Using the output from this node, you can programmatically filter out NSFW images and apply dynamic thresholds.
The output is a list of scores, one for each image in the batch.
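For example, if your API client receives the generated images together with this node's score list, a small helper like the one below could drop flagged images. The names and the way results are passed in are assumptions, since the exact response shape depends on how you expose the workflow:

```python
# Sketch of API-side filtering with a dynamic threshold.
# `images` and `nsfw_scores` are assumed to arrive together from the
# workflow run, one score per image in the batch, as described above.
from typing import Iterable, List

def drop_nsfw(
    images: Iterable[bytes],
    nsfw_scores: Iterable[float],
    threshold: float = 0.95,
) -> List[bytes]:
    """Return only the images whose NSFW score is below the given threshold."""
    return [img for img, score in zip(images, nsfw_scores) if score < threshold]

# Example: a stricter cutoff for a public-facing endpoint.
# safe_images = drop_nsfw(result_images, result_scores, threshold=0.8)
```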
Credits
This project is based on ComfyUI-NSFW-Detection by trumanwong.
Install
- Clone this repo into the custom_nodes directory of your ComfyUI installation
- Run pip install -r requirements.txt